There is a TXT file containing 100,000 records, formatted as follows:
column 1    column 2    column 3    column 4    column 5
a 0000313100 adductive#1 adducting#1 adducent#1
a 0000335600 nascent#1
a 0000355300 0.250 dissilient#1
... (the rest of the 100,000 records follow) ...
The requirement is to import these records into a database. The table structure is:
word_id: auto-increment
word: one TXT record such as [adductive#1 adducting#1 adducent#1] is split into 3 SQL records, one per word
value = column 3 - column 4

The PHP code is as follows:
[code]
<?php
$file  = 'words.txt';                               // the 10W-record TXT source file
$lines = explode("\n", file_get_contents($file));   // read the whole file, one record per line
$i   = 0;
$sql = "INSERT INTO words_sentiment (word, senti_type, senti_value, word_type) VALUES ";
foreach ($lines as $line) {
    if (trim($line) === '') continue;               // skip blank lines
    $i++;
    $cols = explode("\t", trim($line));             // the five columns (assumed tab-separated)
    $senti_value = $cols[2] - $cols[3];             // value = column 3 - column 4
    $mm = explode(' ', $cols[4]);                   // column 5 can hold several words
    foreach ($mm as $m) {                           // one SQL record per word
        $nn   = explode('#', $m);
        $word = $nn[0];                             // strip the #n sense suffix
        $sql .= "(\"$word\", 1, $senti_value, 2),"; // double-quoted: a word may contain '
    }
}
//echo $i;
$sql = substr($sql, 0, -1);                         // remove the trailing comma
file_put_contents('words.sql', $sql);               // write the SQL out instead of running it directly
?>
[/code]
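For reference, the first sample record above would contribute three rows to the generated statement, roughly like this (the 0.125 score is made up for illustration, since the sample lines above don't show both score columns):

[code]
INSERT INTO words_sentiment (word, senti_type, senti_value, word_type) VALUES
("adductive", 1, 0.125, 2),
("adducting", 1, 0.125, 2),
("adducent", 1, 0.125, 2);
[/code]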
1. When importing data in bulk, watch PHP's runtime limits and raise them in advance if needed, or the script may abort partway (see the php.ini snippet after this list).
2. file_get_contents() and file_put_contents() each handle the whole TXT file in a single call.
3. For a bulk import, it is best to import in batches; each small batch is far less likely to fail than one giant run (see the batch sketch after this list).
4. Before the real import, test the script repeatedly on a small sample, for example 100 records, and only run it on the full file once it checks out.
5. If PHP's memory_limit is still not enough, the program cannot run at all. Raising memory_limit in php.ini is preferable to overriding it with a temporary ini_set() statement.
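On note 3, one simple way to batch is to convert only a window of line numbers per run and write one SQL file per batch. This is a minimal sketch, not from the original post: the $start/$end window is a hypothetical parameter, and the parsing assumes the same tab-separated layout as above.

[code]
<?php
$start = 20000;                                     // first line of this batch (hypothetical)
$end   = 25000;                                     // one past the last line of this batch
$sql   = "INSERT INTO words_sentiment (word, senti_type, senti_value, word_type) VALUES ";
foreach (explode("\n", file_get_contents('words.txt')) as $i => $line) {
    if ($i < $start || $i >= $end || trim($line) === '') continue; // outside this batch
    $cols = explode("\t", trim($line));
    $senti_value = $cols[2] - $cols[3];
    foreach (explode(' ', $cols[4]) as $m) {
        $nn   = explode('#', $m);
        $sql .= "(\"{$nn[0]}\", 1, $senti_value, 2),";
    }
}
file_put_contents("batch-$start-$end.sql", substr($sql, 0, -1) . ';'); // one file per batch
?>
[/code]

Run it once per window, adjusting $start/$end each time, and import each generated file separately; if one batch fails, only that window has to be redone.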
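And on notes 1 and 5, memory_limit is the directive the notes name; max_execution_time is my addition, another limit that often interrupts long imports. Example values only, to be tuned per machine:

[code]
; php.ini -- raise these before a large import (example values)
memory_limit = 256M        ; headroom for the whole file plus the SQL string
max_execution_time = 300   ; seconds a script may run before being stopped
[/code]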