From 1d609fc5ab9b5d1c89d5d7d8f321c4e8493b4437 Mon Sep 17 00:00:00 2001
From: Rainer Gerhards
Date: Fri, 1 Oct 2010 10:27:44 +0200
Subject: omhdfs: some cleanup of build system
but still pretty ugly. Any Java folks out there to help clean it up?
---
doc/omhdfs.html | 49 +++++++++++++++++++++++++++++++++++++++++++++++++
1 file changed, 49 insertions(+)
create mode 100644 doc/omhdfs.html
diff --git a/doc/omhdfs.html b/doc/omhdfs.html
new file mode 100644
index 00000000..827697b6
--- /dev/null
+++ b/doc/omhdfs.html
@@ -0,0 +1,49 @@
+
+rsyslog output module for HDFS (omhdfs)
+back
+
+
+HDFS Output Module (omhdfs)
+Module Name: omhdfs
+Available since: 5.7.2
+Author: Rainer Gerhards <rgerhards@adiscon.com>
+Description:
+This module supports writing messages into files on Hadoop's HDFS
+file system.
+
Configuration Directives:
+
+Caveats/Known Bugs:
+Building omhdfs is a challenge because we have not yet found out how
+to integrate Java properly into the autotools build process. The issue is
+that HDFS is written in Java and libhdfs uses JNI to talk to it. That requires
+that various system-specific environment options and paths be set correctly. At
+this point, we leave this to the user. If someone knows how to do it better,
+please drop us a line!
+
In order to build, you need to set these environment variables BEFORE running
+./configure:
+
+- JAVA_INCLUDES - must have all include paths that are needed to build
+JNI C programs, including the -I options necessary for gcc. An example is
+# export JAVA_INCLUDES="-I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include/linux"
+- JAVA_LIBS - must have all library paths that are needed to build
+JNI C programs, including the -l/-L options necessary for gcc. An example is
+# export JAVA_LIBS="-L/usr/java/jdk1.6.0_21/jre/lib/amd64 -L/usr/java/jdk1.6.0_21/jre/lib/amd64/server -ljava -ljvm -lverify"
+
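The two exported variables described above can be combined into a short pre-configure script. This is only a sketch: the JDK path is an example from one installation and must be adapted, and the --enable-omhdfs configure flag is an assumption (check ./configure --help for the exact name).

```shell
#!/bin/sh
# Sketch of the build environment setup for omhdfs.
# JDK path below is an example; adjust it to your installation.
JDK=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64

# Include paths needed to compile JNI C programs (-I options for gcc).
export JAVA_INCLUDES="-I$JDK/include -I$JDK/include/linux"

# Library paths and libraries needed to link JNI C programs (-L/-l options).
export JAVA_LIBS="-L$JDK/jre/lib/amd64 -L$JDK/jre/lib/amd64/server -ljava -ljvm -lverify"

# --enable-omhdfs is assumed here; verify the flag name via ./configure --help.
./configure --enable-omhdfs
```

Both variables must be set in the same shell session that runs ./configure, since the build system reads them only at configure time.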
+
+Sample:
+
+
+
+[manual index] [rsyslog site]
+This documentation is part of the rsyslog
+project.
+Copyright © 2010 by Rainer Gerhards and
+Adiscon.
+Released under the GNU GPL version 3 or higher.
+
+
--
From d5f16404f93d54afddebb9fb683469fc712d2335 Mon Sep 17 00:00:00 2001
From: Rainer Gerhards
Date: Fri, 1 Oct 2010 17:26:14 +0200
Subject: omhdfs: added doc
---
doc/omhdfs.html | 28 ++++++++++++++++++++++++----
1 file changed, 24 insertions(+), 4 deletions(-)
diff --git a/doc/omhdfs.html b/doc/omhdfs.html
index 827697b6..3849f167 100644
--- a/doc/omhdfs.html
+++ b/doc/omhdfs.html
@@ -12,8 +12,19 @@
file system.
Configuration Directives:
-- $...
-option...
+- $OMHDFSFileName [name]
+The name of the file to which the output data shall be written.
+
+- $OMHDFSHost [name]
+Name or IP address of the HDFS host to connect to.
+
+- $OMHDFSPort [name]
+Port on which to connect to the HDFS host.
+
+- $OMHDFSDefaultTemplate [name]
+Default template to be used when none is specified. This saves the work of
+specifying the same template over and over again. Of course, the default
+template can be overridden via the usual method.
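Putting the directives above together, a legacy-format action could look like the following sketch. The host name, port, and file name are hypothetical values, and the :omhdfs: action selector follows the usual legacy output-module convention; adjust everything to your cluster.

```
$ModLoad omhdfs

# Hypothetical values - adjust host, port, and file name to your setup
$OMHDFSHost namenode.example.com
$OMHDFSPort 8020
$OMHDFSFileName /var/log/rsyslog-messages

# send all messages to the omhdfs action configured above
*.* :omhdfs:
```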
Caveats/Known Bugs:
@@ -23,7 +34,8 @@ that HDFS is written in Java and libhdfs uses JNI to talk to it. That requires
that various system-specific environment options and paths be set correctly. At
this point, we leave this to the user. If someone knows how to do it better,
please drop us a line!
-In order to build, you need to set these environment variables BEFORE running
+
+- In order to build, you need to set these environment variables BEFORE running
./configure:
- JAVA_INCLUDES - must have all include paths that are needed to build
@@ -32,12 +44,20 @@ JNI C programs, including the -I options necessary for gcc. An example is
- JAVA_LIBS - must have all library paths that are needed to build
JNI C programs, including the -l/-L options necessary for gcc. An example is
# export JAVA_LIBS="-L/usr/java/jdk1.6.0_21/jre/lib/amd64 -L/usr/java/jdk1.6.0_21/jre/lib/amd64/server -ljava -ljvm -lverify"
+
+- Due to the HDFS architecture, you must make sure that all relevant environment
+variables (the usual Java settings and Hadoop's home directory) are properly set.
+
- As it looks, libhdfs makes Java print exceptions to stdout. There is no
+known work-around for this (and it usually should not cause any trouble).
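The note above about Java and Hadoop environment variables can be made concrete with a runtime sketch. All paths here are hypothetical examples; libhdfs starts a JVM via JNI, so in addition to the Java settings the Hadoop jars generally need to be on the CLASSPATH of the rsyslog process.

```shell
#!/bin/sh
# Runtime environment sketch for a libhdfs-based module.
# HADOOP_HOME and JAVA_HOME values are hypothetical; adjust to your install.
export HADOOP_HOME=/usr/lib/hadoop
export JAVA_HOME=/usr/lib/jvm/java

# Collect the Hadoop configuration directory and jars into CLASSPATH,
# which the embedded JVM reads when libhdfs initializes it.
CLASSPATH=$HADOOP_HOME/conf
for jar in "$HADOOP_HOME"/*.jar "$HADOOP_HOME"/lib/*.jar; do
    CLASSPATH=$CLASSPATH:$jar
done
export CLASSPATH
```

These variables must be visible to the environment rsyslogd is started from (e.g. exported in its init script), not just in an interactive shell.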
Sample:
-