Hello All,
I want to ingest a CSV file into a Hive table, but the CSV format changes frequently, i.e. I get a different file layout each time. I would like to develop a workflow that handles any kind of CSV file. Does NiFi provide any such option? If yes, what process should be followed?
@Shu,
Hi Shu, I need your help. I have to ingest CSV into Hive, and below is the flow:
GetFile --> InferAvroSchema --> ConvertCSVToAvro --> PutHDFS --> ReplaceText --> PutHiveQL
I am getting data through PutHDFS, and it is in Avro format. I am using ReplaceText to build the INSERT statement, but it is not picking up any field values.
The inferred Avro schema fields look like this:
[{"name":"field_0","type":"string","doc":"Type inferred from 'ID'"},{"name":"field_1","type":"string","doc":"Type inferred from 'CITY_NAME'"},{"name":"field_2","type":"string","doc":"Type inferred from 'ZIP_CD'"},{"name":"field_3","type":"string","doc":"Type inferred from 'STATE_CD'"}]
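For reference, those schema fields map one-to-one to the target columns, and the ReplaceText step is supposed to turn each record into an INSERT built from them. A minimal Python sketch of what that generated statement should look like (the table name aaa comes from the post; the sample record values are assumptions):

```python
import json

# Avro field list as inferred by InferAvroSchema (copied from above)
schema_fields = json.loads(
    '[{"name":"field_0","type":"string","doc":"Type inferred from \'ID\'"},'
    '{"name":"field_1","type":"string","doc":"Type inferred from \'CITY_NAME\'"},'
    '{"name":"field_2","type":"string","doc":"Type inferred from \'ZIP_CD\'"},'
    '{"name":"field_3","type":"string","doc":"Type inferred from \'STATE_CD\'"}]'
)

# A hypothetical record, as ConvertAvroToJSON would emit it
record = {"field_0": "1", "field_1": "Boston", "field_2": "02101", "field_3": "MA"}

# Build the INSERT the flow is trying to produce
columns = [f["name"] for f in schema_fields]
values = ", ".join("'%s'" % record[c] for c in columns)
sql = "INSERT INTO aaa (%s) VALUES (%s)" % (", ".join(columns), values)
print(sql)
```

Because the column list is derived from the inferred schema, the same logic works even when the CSV layout changes between files.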
I am using the INSERT command below:
INSERT INTO aaa (field_0, field_1, field_2, field_3) VALUES ('${field_0}', '${field_1}', '${field_2}', '${field_3}')
but it's not working. The SQL statement generated is:
insert into aaa values (,,,)
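For context on why the values come out empty: NiFi Expression Language such as ${field_0} is evaluated against flowfile attributes, not the flowfile content, and an attribute that was never set resolves to an empty string. A rough Python sketch of that substitution behavior (attribute names taken from the template above):

```python
import re

template = ("INSERT INTO aaa (field_0, field_1, field_2, field_3) "
            "VALUES ('${field_0}', '${field_1}', '${field_2}', '${field_3}')")

# Nothing in the flow populated these attributes, so the map is empty
attributes = {}

# Missing attributes resolve to an empty string, mimicking NiFi EL
sql = re.sub(r"\$\{(\w+)\}", lambda m: attributes.get(m.group(1), ""), template)
print(sql)
```

This is why the generated statement has no values: the Avro record data is in the content, while ReplaceText's ${...} references only see attributes.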
Answer by Mr Anticipation
Hello Matt, I need your help on the above. I have tried multiple things and nothing seems to work. I am using NiFi 1.8.
When I use ConvertJSONToSQL, I get "insert into aaa values (?,?,?,?)". How do I pass the values here?
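As background on those question marks: they are JDBC bind parameters, not literals to replace. ConvertJSONToSQL sets numbered argument attributes (sql.args.N.type / sql.args.N.value) on the flowfile, and the downstream processor binds them into the ?s as a prepared statement; note that PutHiveQL may expect the hiveql.args.N prefix rather than sql.args.N, so an UpdateAttribute rename may be needed (check the processor documentation for your NiFi version). A small Python sketch of the binding semantics, using an in-memory SQLite table standing in for the Hive table aaa (sample values are assumptions):

```python
import sqlite3

# In-memory table standing in for the Hive table "aaa"
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE aaa (field_0 TEXT, field_1 TEXT, field_2 TEXT, field_3 TEXT)")

# The parameterized statement ConvertJSONToSQL generates
sql = "insert into aaa values (?,?,?,?)"

# The numbered argument attributes supply the bind values, in order
args = ("1", "Boston", "02101", "MA")
conn.execute(sql, args)

rows = conn.execute("SELECT * FROM aaa").fetchall()
print(rows)
```

In other words, you don't edit the ?s yourself; the processor fills them from the numbered argument attributes at execution time.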
© 2011-2019 Hortonworks Inc. All Rights Reserved.