sqoop import to hive

  • The Sqoop import tool's primary function is to load RDBMS data into files in HDFS.
  • If a Hive metastore is associated with your HDFS cluster, Sqoop can also import the data into Hive by generating and executing a CREATE TABLE statement that defines the data's layout in Hive.
  • Importing data into Hive is as simple as adding the --hive-import option to your Sqoop command line.
  • $ sqoop <<tool-name>> \
        {generic-arguments} \
        {hive-arguments}
  • Sqoop Tool => create-hive-table
    • The create-hive-table tool populates a Hive metastore with a definition for a table based on a database table previously imported to HDFS, or one planned to be imported.
    $ sqoop create-hive-table (generic-args) (create-hive-table-args)
    $ sqoop-create-hive-table (generic-args) (create-hive-table-args)
    Click "Copy code" button to copy into clipboard - By wikitechy - sqoop tutorial - team
    Example:
    
    $ sqoop create-hive-table \
    --connect "jdbc:mysql://localhost/<<DB-Name>>" \
    --username root -P \
    --table <<Table-Name>> \
    --hive-table <<DB-Name>>.<<Table-Name>>
  • Apache Sqoop => HIVE Import
    • If you want to move your data directly from a structured data store into the Hive warehouse, use --hive-import.
    • If the table already exists in Hive and you want to overwrite its contents, use --hive-overwrite.
    • If the table definition does not yet exist in the Hive warehouse, use --create-hive-table.
     
    $ sqoop import \
    --connect "jdbc:mysql://localhost/classicmodels" \
    --username root -P \
    --table employees \
    --target-dir /usr/hive/warehouse/<<db-name>>.db \
    --fields-terminated-by "," \
    --hive-import \
    --create-hive-table \
    --hive-table <<DB-Name>>.<<Hive-table>>
    Click "Copy code" button to copy into clipboard - By wikitechy - sqoop tutorial - team

    Important points to note on Sqoop import to Hive

  • Sqoop cannot import data into Hive if the file format is set to Avro ({--as-avrodatafile}) or SequenceFile ({--as-sequencefile}).
  • Limitations of Hive's input parsing abilities:
  • If you use --escaped-by, --enclosed-by, or --optionally-enclosed-by when importing data into Hive, Sqoop will print a warning message, because Hive cannot parse those enclosing and escaping characters.
  • Hive will have problems using Sqoop-imported data if your database's rows contain string fields with Hive's default row delimiters (\n and \r characters) or column delimiters (\01 characters) embedded in them.
  • You can use the {--hive-drop-import-delims} option to drop those characters on import and produce Hive-compatible text data.
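    As a sketch of the point above (the connection string, database, and table names are placeholders borrowed from the earlier examples, and a reachable MySQL instance plus a configured Hive metastore are assumed), an import that strips embedded delimiter characters might look like:

    ```shell
    # Hypothetical sketch: drop Hive's default delimiter characters
    # (\n, \r, \01) from string fields during import, so the resulting
    # text files parse cleanly in Hive.
    sqoop import \
      --connect jdbc:mysql://localhost/classicmodels \
      --username root -P \
      --table employees \
      --hive-import \
      --hive-table classicmodels.employees \
      --hive-drop-import-delims
    ```

    If you need to keep some marker instead of silently dropping characters, Sqoop also offers --hive-delims-replacement, which substitutes a user-supplied string for the offending delimiters.
    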
  • Example: importing into a static Hive partition, selecting rows with a WHERE clause:

    sqoop import \
    --connect jdbc:mysql://localhost/classicmodels \
    --username root -P \
    --table employees \
    --columns <<column-names>> \
    --where "<<condition>>" \
    --create-hive-table \
    --hive-import \
    --hive-table employees \
    --hive-partition-key "<<partition-column-name>>" \
    --hive-partition-value "<<partition-column-value>>"
  • Sqoop does not support dynamic partitions, so you typically import one partition at a time using a WHERE clause; to achieve dynamic partitioning, you can use HCatalog instead.
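    A hedged sketch of the HCatalog route (database and table names are placeholders, and an HCatalog installation alongside Hive is assumed): Sqoop's HCatalog integration writes through the metastore, so Hive partitions can be derived from the imported data itself rather than fixed by a single --hive-partition-value.

    ```shell
    # Hypothetical sketch: import via HCatalog so partitions are
    # populated dynamically from the data, instead of one static
    # partition per Sqoop invocation.
    sqoop import \
      --connect jdbc:mysql://localhost/classicmodels \
      --username root -P \
      --table employees \
      --hcatalog-database classicmodels \
      --hcatalog-table employees \
      --create-hcatalog-table
    ```
    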