
How to insert text into a table in Hive

1. How do you insert data into a Hive table with an insert statement?

Load data from a file into a table (with the overwrite keyword the existing data is replaced; without it the data is appended):

load data local inpath 'dim_csl_rule_config.txt' overwrite into table dim.dim_csl_rule_config;
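
Without the overwrite keyword the same statement appends instead of replacing; a minimal sketch using the same table:

load data local inpath 'dim_csl_rule_config.txt' into table dim.dim_csl_rule_config;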

- Insert data from a query statement into a table:

insert overwrite table test_h02_click_log partition(dt)

select * from stage.s_h02_click_log where dt = '2014-01-22' limit 100;
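
In Hive 0.14 and later, rows can also be inserted literally with an insert ... values statement; a minimal sketch (the table name t and its columns are assumptions):

insert into table t values (1, 'a'), (2, 'b');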

2. How do you import data from Excel into the Hive warehouse?

To import data stored in an Excel sheet into Hive, the prerequisite is that the data types and lengths in the Excel sheet are consistent with the field definitions of the target table in the Hive warehouse; otherwise an error is reported. Also, the Excel data cannot be imported into the Hive warehouse directly, because the table was built with a row format whose fields are delimited by ','. The workaround is described below; a sketch of such a table definition follows this paragraph.
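
As a minimal sketch (table and column names are assumptions, not the article's actual table), such a table definition looks like:

create table excel_import_demo (id int, name string)
row format delimited
fields terminated by ',';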

First, save the Excel sheet as data.csv. By default, the .csv format separates fields with ','; you can open data.csv with Notepad++ to check. The data can then be imported into the Hive warehouse. However, when executing the import statement, an error is reported complaining about a wrong file format.
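
The article does not reproduce the failing statement; a hedged sketch of what such an import looks like, reusing the hypothetical table above and an assumed file path:

hive> load data local inpath '/tmp/data.csv' into table excel_import_demo;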

Checking showed that the RCFILE storage format had been used when the table was created:

stored as inputformat 'org.apache.hadoop.hive.ql.io.RCFileInputFormat'

outputformat 'org.apache.hadoop.hive.ql.io.RCFileOutputFormat'

The file being imported, however, is in TextFile format, hence the error (TextFile is also the default storage format when a table is created without specifying one).
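
One way to check a table's storage format (a minimal sketch; excel_import_demo is the hypothetical table from above) is Hive's describe formatted or show create table:

hive> describe formatted excel_import_demo;
hive> show create table excel_import_demo;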

How to convert files into RCFile format:

(1) In Hive, do the conversion directly with an insert from a TextFile table, i.e. select the TextFile data into an RCFile table, as shown below.

insert overwrite table _RCTable partition(dt='2013-09-30')
select p_id, tm, idate, phone from tmp_testp where dt='2013-09-30';

(2) Use MapReduce to convert ordinary files into RCFile format, and then read the RCFile output.

3. How do you export a Hive table to a local file and import it into MySQL?
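
The answer below covers the MySQL side; for the Hive-to-local-file step itself, a minimal hedged sketch (directory, delimiter, and table name are assumptions; the row format clause needs Hive 0.11 or later):

hive> insert overwrite local directory '/tmp/hive_export'
    > row format delimited fields terminated by ','
    > select * from your_table;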

Exporting a database from the MySQL command line:

1. Change into the bin folder under the MySQL directory: cd from the MySQL installation directory into its bin folder.

For example, on the command line: cd C:\Program Files\MySQL\MySQL Server 4.1\bin

(or add the directory to the Windows PATH environment variable instead)

2. Export the database: mysqldump -u username -p database_name > exported_file_name

For example, on the command line: mysqldump -u root -p news > news.sql (you will be prompted for the MySQL password).

(To export a single table, just add the table name after the database name.)

3. The news.sql file is then generated in the bin folder.

Importing a database from the command line:

1. Move the .sql file into the bin folder, which keeps the path convenient.

2. Same as step 1 of the export procedure above (change into the bin folder).

3. Enter MySQL: mysql -u username -p

For example, on the command line: mysql -u root -p (you will likewise be prompted for the MySQL password).

4. In MySQL-Front, create the database you want to import into; it is empty at this point, for example a new target database named news.

5. Enter: mysql> use target_database_name;

For example, on the command line: mysql> use news;

6. Import the file: mysql> source exported_file_name;

For example, on the command line: mysql> source news.sql
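
An equivalent one-step alternative (a sketch, not part of the original steps) is to feed the file to mysql directly from the Windows command line instead of the mysql prompt:

mysql -u root -p news < news.sql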

4. Hive's common data import methods, dynamic partitioning, and multi-table insertion.

Three methods are commonly used: 1. import data from the local file system into a Hive table; 2. import data from HDFS into a Hive table; 3. when creating a table, query the relevant records from another table and insert them into the newly created table.

Hive configuration: the Hive data file storage directory on HDFS (created automatically after Hive starts) is hdfs:/usr/hive/warehouse (the hadoop fs -mkdir /usr/hive/warehouse command creates it). The local data storage directory is /home/santiago/data/hive.

1. Import data from the local file system

(1) Create a table in Hive:

hive> show databases;
OK
default
Time taken: 1.706 seconds, Fetched: 1 row(s)
hive> create table guo_test(Name string,String string)
    > row format delimited
    > fields terminated by ','
    > stored as textfile;
hive> show tables;
OK
guo_test
Time taken: 0.024 seconds, Fetched: 1 row(s)

(2) Create the data in a local file:

santi@hdp:~/data/hive$ ls
hive_test.txt
santi@hdp:~/data/hive$ cat hive_test.txt
santi,you scum

(3) Import the data and test:

hive> load data local inpath '/home/santi/data/hive/hive_test.txt' into table guo_test;
hive> select * from guo_test;
hive> dfs -ls /usr/hive/warehouse/guo_test;

# hadoop fs -ls /usr/hive/warehouse
Found 1 items
drwxrwxr-x - santiago supergroup 0 2017-01-14 21:13 /usr/hive/warehouse/guo_test

The guo_test folder appears under the warehouse directory configured in hive-site, and the file written into the Hive data warehouse can be found inside it:

hive> select * from guo_test;
OK
santi you scum

The local data has been written successfully. [Note] During the import from local into the Hive table, the data is actually first copied to a temporary directory on HDFS (usually the uploading user's HDFS home directory, e.g. /home/santi/), and then moved from that temporary directory into the data directory of the Hive table (the temporary directory does not keep the data).

2. Import data from the HDFS file system into a Hive table

(1) Create a data file on the HDFS file system. There is no vim command on HDFS, so the data file has to be created locally and then transferred to HDFS by hand:

/data/hive# vim data_HDtoHive
/data/hive# cat data_HDtoHive
data from,HDFS to Hive

# hadoop fs -put /home/santi/data/hive/data_HDtoHive /usr/data/input
# hadoop fs -ls /usr/data/input

(2) Import the data:

hive> load data inpath '/usr/data/input/data_HDtoHive' into table guo_test;
hive> select * from guo_test;
OK
data from HDFS to Hive
santi you scum
Time taken: 0.172 seconds, Fetched: 2 row(s)

The data has been written successfully into the storage location configured for Hive.

[Note] The statement for importing from the local file system is:
hive> load data local inpath '/home/santi/data/hive/hive_test.txt' into table guo_test;
The statement for importing from HDFS is:
hive> load data inpath '/usr/data/input/data_HDtoHive' into table guo_test;
The difference is the local keyword.

When importing from HDFS into a Hive table, the data is moved rather than copied: after the load, the source file can no longer be found at its original location on HDFS.
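
A quick way to verify this, reusing the path from the walkthrough above (a sketch):

# hadoop fs -ls /usr/data/input
(the data_HDtoHive file no longer appears in the listing)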

3. Select data from a Hive table and insert it into a new Hive table. The command form is create table new_table as select xxx from old_table.

hive> create table hivedata_test1
    > as
    > select Name
    > from guo_test;
hive> select * from hivedata_test1;
OK
data from
santi
Time taken: 0.116 seconds, Fetched: 2 row(s)

[Note] On Hive partitioned tables: in Hive, each partition of a table corresponds to a directory under the table's directory, and all of a partition's data is stored in that directory.

For example, if a table has two partition columns a and b, then the directory corresponding to a=xxx, b=xx is /user/hive/warehouse/a=xxx/b=xx, and all data belonging to that partition is stored in that directory.

hive> create table hivedata_test2(
    > Name string)
    > partitioned by
    > (String string)
    > row format delimited
    > fields terminated by ','
    > stored as textfile;
hive> insert into table hivedata_test2
    > partition(String='best')
    > select Name
    > from guo_test;
hive> select * from hivedata_test2;
OK
data from best
santi best
Time taken: 1.549 seconds, Fetched: 2 row(s)

# hadoop fs -ls /usr/hive/warehouse/hivedata_test2
Found 1 items
drwxrwxr-x - santiago supergroup 0 2017-02-14 17:40 /usr/hive/warehouse/hivedata_test2/String=best

5. What is the syntax for adding a table comment in Hive?

To add a comment, start the comment text with a single quotation mark. The comment symbol tells Visual Basic to ignore everything after it on the line; that part of the code snippet is the comment, and it is displayed in green in the code editor.

A comment can share a line with a statement, written after it, or occupy an entire line of its own.

For example:

' Enter a welcome message in the text box.

Private Sub Command1_Click()

Text1.Text = "Hello."   ' Set the Text property of Text1 to Hello.

End Sub

Note that no further code can follow a comment on the same line, since the comment runs to the end of the line.
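
Returning to Hive itself: table and column comments are added with the comment clause at creation time, or changed afterwards through table properties. A minimal sketch (table and column names are assumptions):

create table page_demo (
  viewtime int comment 'time the page was viewed',
  userid bigint comment 'id of the user'
)
comment 'demo table with comments';

alter table page_demo set tblproperties ('comment' = 'updated table comment');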