The source code for this post is on GitHub. Read the Graylog Docker installation docs first. Graylog depends on MongoDB (for its configuration data) and Elasticsearch (for log storage), which I didn't realise before starting. Elasticsearch is cool.
Build the Node image with:

```shell
docker build -t bunyan-node .
```

Then bring everything up with:

```shell
docker compose up --build
```
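The compose file itself isn't shown here, but a minimal sketch of the services involved might look like this (service names, image tags, and port mappings are my assumptions, not taken from the repo):

```yaml
version: '3'
services:
  mongo:
    image: mongo:3          # Graylog's configuration store

  elasticsearch:
    image: elasticsearch:2  # Graylog's log storage backend

  graylog:
    image: graylog2/server
    depends_on:
      - mongo
      - elasticsearch
    ports:
      - "9000:9000"         # web interface
      - "12201:12201/udp"   # GELF UDP input

  app:
    image: bunyan-node      # the image built above
    depends_on:
      - graylog
    ports:
      - "8001:8001"
```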
Testing the endpoint over HTTP is easy:

```shell
curl -XPOST http://localhost:12201/gelf -p0 -d '{"short_message":"Hello there", "host":"example.org", "facility":"test", "_foo":"bar"}'
```
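The body of that request is just JSON following the GELF format: `version`, `host`, and `short_message` are the required fields, and custom fields take a leading underscore. A minimal sketch of building such a payload in Node (the helper name and values are illustrative, not from the source repo):

```javascript
// Build a minimal GELF message object. Required fields are
// version, host, and short_message; custom fields must be
// prefixed with "_" per the GELF format.
function gelfMessage(shortMessage, extra = {}) {
  const msg = {
    version: '1.1',
    host: 'example.org',
    short_message: shortMessage,
    timestamp: Date.now() / 1000, // seconds since epoch
    level: 6,                     // syslog "informational"
  };
  for (const [key, value] of Object.entries(extra)) {
    msg[`_${key}`] = value;
  }
  return msg;
}

const payload = gelfMessage('Hello there', { foo: 'bar' });
console.log(JSON.stringify(payload));
```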
However, since we're logging via streams rather than one-off HTTP requests, you need to set up a GELF UDP input in Graylog. For this example I've used a module called gelf-stream, which sends GELF messages over UDP.
```js
const http = require('http');
const bunyan = require('bunyan');
const gelfStream = require('gelf-stream');

// Forward logs to the Graylog host (GELF UDP, port 12201 by default)
const stream = gelfStream.forBunyan('graylog');

const log = bunyan.createLogger({
  name: 'myapp',
  streams: [{
    // 'raw' hands the log record object straight to the stream,
    // which is what gelf-stream expects
    type: 'raw',
    stream,
  }],
});

http.createServer((req, res) => {
  const then = Date.now();
  log.info(`Hello ${then}`);
  res.end(`Hello ${then}`);
}).listen(8001);
```
Voila.
References
- Graylog on Docker Hub
- Bunyan Node module
- Rising stack
- Real time data