HIVE post list (9)
지구정복
CHAPTER 6 Apache Spark / Configuration / Configuring Apache Iceberg and Spark / Configuring via the CLI. As a first step, you'll need to specify the required packages to be installed and used with the Spark session. To do so, Spark provides the --packages option, which lets Spark easily download the specified Maven-based packages and their dependencies and add them to the classpath of your application. ..
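To make the --packages flow concrete, here is a minimal PySpark sketch of the same configuration set programmatically; the artifact coordinates (Spark 3.5 / Scala 2.12 / Iceberg 1.5.0), the catalog name `local`, and the warehouse path are illustrative assumptions and should be matched to your environment.

```python
# Minimal sketch: a PySpark session with the Iceberg Spark runtime pulled
# from Maven. Versions, the catalog name "local", and the warehouse path
# are assumptions for illustration.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-config-demo")
    # Equivalent to passing `--packages ...` on the spark-sql/spark-shell CLI:
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0")
    # Register Iceberg's SQL extensions and a Hadoop-based catalog:
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Sanity check: create an Iceberg table through the configured catalog.
spark.sql("CREATE TABLE IF NOT EXISTS local.db.t (id BIGINT) USING iceberg")
```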

A data warehouse acts as a centralized repository for organizations to store all their data coming in from a multitude of sources, allowing data consumers such as analysts and BI engineers to access data easily and quickly from one single source to start their analysis. The Data Lake: while data warehouses provided a mechanism for running analytics on structured data, they still had several issues: ..

This content is taken from the Apache Iceberg guidebook. CHAPTER 3 Lifecycle of Write and Read Queries / Writing Queries in Apache Iceberg. Create the Table: send the query to the engine, write the metadata file, update the catalog file to commit changes. Insert Query: send the query to the engine, check the catalog, write the datafiles and metadata files, update the catalog file to commit changes. Merge Query: send the quer..
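The write lifecycle above maps directly onto engine-side SQL. Below is a hedged sketch that exercises each step (create, insert, merge) against an Iceberg table, reusing the `spark` session and `local` catalog from the earlier sketch; the table and column names are made up for illustration.

```python
# Sketch of the three write paths described above. Assumes the `spark`
# session and `local` Hadoop catalog from the previous example.

# Create the Table: the engine writes a new metadata file and commits it
# by updating the catalog.
spark.sql("""
    CREATE TABLE IF NOT EXISTS local.db.orders (
        order_id BIGINT, status STRING
    ) USING iceberg
""")

# Insert Query: the engine checks the catalog for the current metadata,
# writes datafiles and a new metadata file, then commits via the catalog.
spark.sql("INSERT INTO local.db.orders VALUES (1, 'NEW'), (2, 'NEW')")

# Merge Query: MERGE INTO works here because the Iceberg SQL extensions
# were registered on the session.
spark.sql("""
    MERGE INTO local.db.orders t
    USING (SELECT 1 AS order_id, 'SHIPPED' AS status) s
    ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.status = s.status
    WHEN NOT MATCHED THEN INSERT *
""")
```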

OpenLDAP, Hue, and Hive are currently integrated. Whenever I opened Hue's Hive editor, the following error kept occurring: Bad status: 3 (b'Error validating the login') (code THRIFTTRANSPORT): TTransportException("Bad status: 3 (b'Error validating the login')") Checking the Hue log also showed the following: [27/Feb/2025 16:50:17 +0900] base DEBUG Selected interpreter hive interface=hiveserver2 compute=None [27/Feb/2025 16:50:17 +0900] dbms DEBU..
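The error above means HiveServer2 rejected the LDAP credentials at the Thrift transport layer. One way to isolate whether the fault lies in Hue or in HiveServer2's LDAP setup is to attempt the same login outside Hue. A minimal sketch with the PyHive client follows, assuming HiveServer2 listens on its default Thrift port 10000 with LDAP authentication enabled; the host, username, and password are placeholders.

```python
# Diagnostic sketch: reproduce the LDAP login outside Hue using PyHive.
# If this also raises TTransportException("Error validating the login"),
# the problem is in HiveServer2's LDAP configuration rather than in Hue.
# Host, user, and password below are placeholders.
from pyhive import hive

conn = hive.Connection(
    host="hiveserver2.example.com",  # placeholder host
    port=10000,                      # default HiveServer2 Thrift port
    auth="LDAP",                     # send username/password via SASL PLAIN
    username="testuser",             # an account known to OpenLDAP
    password="testpassword",
)
cur = conn.cursor()
cur.execute("SHOW DATABASES")
print(cur.fetchall())
```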