


Import the results of a query from a relational database into HDFS

  • A free-form query (--query) can be used instead of --table in an import operation:
  • sqoop import --query 'select Id,Message from TestTable where id > 100 AND $CONDITIONS' \
                 --connect "jdbc:sqlserver://192.168.1.100:1433;database=Test_db" \
                 --username user \
                 --password password \
                 --split-by id \
                 --target-dir /user/test/ \
                 --fields-terminated-by '\t'
  • $CONDITIONS - mandatory in the query even when no real WHERE condition is needed; at runtime Sqoop replaces it with the range predicate assigned to each map task. Note that --where applies only to table-based imports, so with --query any filter (such as id > 100 above) must be written directly into the query, joined to $CONDITIONS with AND.
  • --split-by - mandatory when using --query (unless the job runs with a single mapper, -m 1); it names the column used to divide the import into parallel tasks in the MapReduce job.
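To make the two bullets above concrete, here is a minimal Python sketch (not Sqoop's actual code) of what Sqoop conceptually does with $CONDITIONS: it queries the minimum and maximum of the --split-by column, divides that range into one slice per mapper, and substitutes each slice's predicate for the placeholder. The function name and the range values are illustrative assumptions.

```python
# Hypothetical illustration of Sqoop's $CONDITIONS substitution.
# Sqoop first runs a boundary query (SELECT MIN(id), MAX(id)) on the
# --split-by column, then gives each map task one slice of that range.

def split_conditions(query, split_col, lo, hi, num_mappers):
    """Replace $CONDITIONS with one range predicate per map task."""
    step = (hi - lo + 1) / num_mappers
    tasks = []
    for i in range(num_mappers):
        start = lo + round(i * step)
        # The last slice is clamped to the upper boundary.
        end = hi if i == num_mappers - 1 else lo + round((i + 1) * step) - 1
        predicate = f"{split_col} >= {start} AND {split_col} <= {end}"
        tasks.append(query.replace("$CONDITIONS", predicate))
    return tasks

base = "select Id,Message from TestTable where id > 100 AND $CONDITIONS"
for q in split_conditions(base, "id", 101, 500, 4):
    print(q)
```

With a range of 101..500 and four mappers, each mapper receives the query with a disjoint 100-row predicate (101-200, 201-300, and so on), which is why a poorly chosen --split-by column with skewed values leads to unbalanced map tasks.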
