Comments on Big Data and Cloud Tips: Using Log4J/Flume to log application events into HDFS
Feed author: Praveen Sripati

Anonymous (2018-04-05):
Hi,
Please make the following changes; we faced the same issue and resolved it:

1) Change the operator to !=
2) Put the condition string in double quotes.

As below:

    if [[ $line != "^java\.library\.path=(.*)$" ]]; then

It will work!!

Think (2014-10-08):
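A note on this workaround, not from the original comment: in bash, != with a quoted right-hand side is a literal string comparison, not a regex test, so the change above removes the regex match rather than repairing it. A minimal sketch of the difference under a regular bash, using a hypothetical sample value for $line:

```shell
#!/usr/bin/env bash
# Hypothetical sample value, standing in for a properties line that
# bin/flume-ng scans:
line='java.library.path=/usr/lib/hadoop/lib/native'

# =~ is bash's regex-match operator (bash >= 3.0); the capture group
# lands in BASH_REMATCH:
if [[ $line =~ ^java\.library\.path=(.*)$ ]]; then
  echo "regex matched, path: ${BASH_REMATCH[1]}"
fi
# prints: regex matched, path: /usr/lib/hadoop/lib/native

# != with a quoted right-hand side compares literal strings: it is true
# whenever $line is not exactly the regex text, so it does not actually
# test the pattern at all.
if [[ $line != "^java\.library\.path=(.*)$" ]]; then
  echo "not equal (true for almost any line)"
fi
# prints: not equal (true for almost any line)
```

So the suggested edit makes the script start, but the branch no longer depends on whether the line really sets java.library.path.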
Getting the error below:

    $ bin/flume-ng agent --conf ./conf/ -f conf/flume.conf -Dflume.root.logger=DEBUG,console -n agent1
    bin/flume-ng: line 82: conditional binary operator expected
    bin/flume-ng: line 82: syntax error near `=~'
    bin/flume-ng: line 82: ` if [[ $line =~ ^java\.library\.path=(.*)$ ]]; then'

Random Musings (2014-04-01):
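An aside on this error, inferred from the message rather than stated in the original post: the `=~` operator inside `[[ ]]` was added in bash 3.0, and "conditional binary operator expected" is what a shell without that operator reports when it parses line 82. Checking which shell actually runs the script is the first step:

```shell
#!/usr/bin/env bash
# Show the interpreter named by the script's shebang line (only if the
# script is present under the current directory):
[ -f bin/flume-ng ] && head -n 1 bin/flume-ng

# Show the bash version; [[ ... =~ ... ]] needs bash 3.0 or newer:
bash --version | head -n 1
```

If the default bash is too old, one workaround is to run the script with a bash that is new enough (for example `bash bin/flume-ng agent ...` where that `bash` is 3.0+), rather than editing the conditional.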
I seem to get this error:

    2014-04-01 00:23:19,393 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:422)] process failed
    java.lang.VerifyError: class org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AppendRequestProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;

Any idea why?

Ivan (2014-02-23):
Nice post!!

rawspirit2010 (2014-02-07 20:58):
Did you keep the port open using this command? nc -l -p 41414

rawspirit2010 (2014-02-07 10:22):
Do we need to run the Avro client and the Log4j client as well, to make sure the HDFS source is listening and receiving? I understand that through log4j the log goes to flume.log. How do you see the messages inside the flume/events folders? Is that due to the Avro client and the /etc/passwd file being written as a Sequence file, and what do you mean by Sequence file?

Praveen Sripati (2013-11-21):
Irrespective of Flume running or not, the log4j just hangs in Eclipse.

Maruti (2013-11-19):
Maybe a dumb question: is the port used by the Avro source open? At least on CentOS, ports are not open by default. We faced a similar issue when we tried to distribute a Flume flow across nodes: we had a Flume agent reading a log file on one node, and the Avro source and sink on the other. The flow failed because the port was not open.
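Following up on the port question, a sketch that is not from the original post: port 41414 matches the Avro source in this post, and `127.0.0.1` below is a placeholder for the real agent host. `nc -z` probes whether a TCP port is reachable, and on CentOS 6 the iptables firewall blocks inbound ports unless a rule is added.

```shell
#!/usr/bin/env bash
# On the machine running the Avro source: is anything listening on 41414?
# (netstat flags: tcp, listening, numeric)
netstat -tln | grep 41414 || echo "nothing listening on 41414"

# From the client machine: can we reach it? Replace 127.0.0.1 with the
# agent host name used in your flow.
if nc -z 127.0.0.1 41414; then
  echo "port reachable"
else
  echo "port closed or blocked"
fi

# On CentOS 6 the fix the comment alludes to would be an iptables rule
# (run as root; left commented out since it changes firewall state):
#   iptables -I INPUT -p tcp --dport 41414 -j ACCEPT
#   service iptables save
```

If `nc -z` succeeds from the client node but the flow still fails, the problem is likely in the Flume configuration rather than the network.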