Sqoop application: import into Hive

4.5 Full import of Hive tables
4.5.1 Importing a text table

# import command
sqoop import \
--connect "jdbc:mysql://nn1:3306/sqoop_db?useUnicode=true&characterEncoding=UTF-8" \
--username root \
--password 12345678 \
--table goods_table \
--num-mappers 1 \
--delete-target-dir \
--hive-import \
--fields-terminated-by "\001" \
--hive-overwrite \
--hive-table hainiu.goods_table

The above process is divided into two steps: 1) The first step is […]
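The two-step behaviour that --hive-import performs under the hood can be reproduced by hand, which helps make sense of the steps the excerpt above is about to describe. The sketch below is a minimal equivalent using the same connection and table settings; the staging directory /tmp/goods_table_stage is a hypothetical choice, not a value from the article:

# Step 1 (sketch): import from MySQL into a staging directory on HDFS
sqoop import \
--connect "jdbc:mysql://nn1:3306/sqoop_db?useUnicode=true&characterEncoding=UTF-8" \
--username root \
--password 12345678 \
--table goods_table \
--num-mappers 1 \
--delete-target-dir \
--target-dir /tmp/goods_table_stage \
--fields-terminated-by "\001"

# Step 2 (sketch): load the staged files into the Hive table, replacing its current contents
hive -e "LOAD DATA INPATH '/tmp/goods_table_stage' OVERWRITE INTO TABLE hainiu.goods_table;"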

[Big data: Hive's peripheral tool Sqoop and its application integration]

Contents: Preface: an introduction to Hive's peripheral tool Sqoop and its application scenarios; Server commands to integrate Sqoop with Hive and export data to MySQL; Integrating Hive; Integrating MySQL; Java Spring Boot framework integrating Sqoop, Hive and MySQL to implement Hive import/export; POM dependencies; Add the following configuration to the application.properties file; Code; Explanation of the commands; Controller implementation […]
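As a reference point for the Hive-to-MySQL export path mentioned in the contents above, a minimal Sqoop export reads the table's files from the Hive warehouse directory. The sketch below is illustrative only; the target table goods_table_export, the warehouse path and the credentials are placeholders rather than values from the article:

# export a text-format Hive table (fields delimited by \001) into a MySQL table
sqoop export \
--connect "jdbc:mysql://nn1:3306/sqoop_db?useUnicode=true&characterEncoding=UTF-8" \
--username root \
--password 12345678 \
--table goods_table_export \
--export-dir /user/hive/warehouse/hainiu.db/goods_table \
--input-fields-terminated-by "\001" \
--num-mappers 1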

[Big data: Sqoop, Hive and HDFS data operations]

Contents: Preface: Sqoop integrated with Hive and HDFS to implement data export; Dependencies; Configuration file; Code; Controller calls; Linux command-line import and export; Use Sqoop to import data into Hive tables, for example; Use Sqoop to export data from a Hive table to MySQL, for example; Use Sqoop to import data into HDFS, for example; Use Sqoop to export […]
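For the HDFS import and export items listed above, minimal Sqoop commands look roughly as follows; the connection string, table name and HDFS path are illustrative placeholders:

# import a MySQL table into a plain HDFS directory as tab-delimited text
sqoop import \
--connect jdbc:mysql://nn1:3306/sqoop_db \
--username root \
--password 12345678 \
--table goods_table \
--target-dir /data/sqoop/goods_table \
--fields-terminated-by '\t' \
--num-mappers 1

# export that HDFS directory back into a MySQL table with the same column order
sqoop export \
--connect jdbc:mysql://nn1:3306/sqoop_db \
--username root \
--password 12345678 \
--table goods_table \
--export-dir /data/sqoop/goods_table \
--input-fields-terminated-by '\t' \
--num-mappers 1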

Big data environment setup: Hadoop + Hive + Flume + Sqoop

Table of Contents: 0. Release notes; 1. Install CentOS; 2. Hadoop standalone configuration; 3. Hive installation and deployment; 4. Install and deploy Flume and Nginx; 5. Sqoop installation. 0. Release notes: Hadoop 3.1.0, CentOS 7.6, JDK 1.8. 1. Install CentOS: there are plenty of online tutorials for this, so I won't post screenshots [The memory […]
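As a rough outline of the Sqoop installation step (item 5 above), the usual sequence is to unpack the tarball, set SQOOP_HOME, drop a MySQL JDBC driver into Sqoop's lib directory and verify with sqoop version. The paths and artifact versions below are illustrative and not necessarily those used in the article:

# unpack Sqoop and expose it on the PATH (append the exports to /etc/profile or ~/.bashrc)
tar -zxvf sqoop-1.4.7.bin__hadoop-2.6.0.tar.gz -C /opt
export SQOOP_HOME=/opt/sqoop-1.4.7.bin__hadoop-2.6.0
export PATH=$PATH:$SQOOP_HOME/bin

# copy the MySQL JDBC driver so Sqoop can connect to MySQL
cp mysql-connector-java-5.1.47.jar $SQOOP_HOME/lib/

# confirm that Sqoop finds Hadoop and prints its version
sqoop version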

Full disclosure! Sqoop command execution for taking big data from 0 to 1

Sqoop command execution: common command parameters. Imports and exports are performed by passing different parameters to Sqoop, and the common command lines can be listed with sqoop help. # Common Sqoop parameters: [root@qianfeng01 sqoop-1.4.7] sqoop help: codegen: Generate code to interact with database records; create-hive-table: Import a table definition into Hive; eval: Evaluate a SQL […]
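To show two of the tools listed by sqoop help in use, here is a small illustrative sketch; the connection details and table names are placeholders:

# eval: run an ad-hoc SQL statement against MySQL and print the result
sqoop eval \
--connect jdbc:mysql://nn1:3306/sqoop_db \
--username root \
--password 12345678 \
--query "SELECT COUNT(*) FROM goods_table"

# create-hive-table: create a Hive table whose definition mirrors the MySQL table
sqoop create-hive-table \
--connect jdbc:mysql://nn1:3306/sqoop_db \
--username root \
--password 12345678 \
--table goods_table \
--hive-table hainiu.goods_table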

Big data technology: Sqoop

Chapter 1: Introduction to Sqoop. Sqoop is an open-source tool used mainly for transferring data between Hadoop (Hive) and traditional databases (MySQL, PostgreSQL, …). It can import data from a relational database (such as MySQL, Oracle or Postgres) into Hadoop's HDFS, and it can also move HDFS data into a relational […]
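A quick way to confirm that Sqoop can reach the relational database before running any import or export is to use its list-databases and list-tables tools; the connection details below are placeholders:

# list the databases visible to the given MySQL account
sqoop list-databases \
--connect jdbc:mysql://nn1:3306/ \
--username root \
--password 12345678

# list the tables inside one database
sqoop list-tables \
--connect jdbc:mysql://nn1:3306/sqoop_db \
--username root \
--password 12345678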