Today I come back with another step in connecting our TIBCO BW 6.x environment to the ELK stack, so we can extract all the power from the log information inside our TIBCO BW components. We started with the TEA component: in the previous post we got a successful connection to the ELK stack with the information inside the TEA. But we used the default message format, so we could not take advantage of the custom parts that the TIBCO log files have inside.
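To give an idea of what parsing those custom parts could look like, here is a minimal Logstash filter sketch. The pattern below is an assumption for illustration only: the actual layout of your TIBCO log lines depends on your logback configuration, so you would need to adapt each `%{...}` piece to your own log format.

```
filter {
  grok {
    # Hypothetical pattern -- adjust to match your own logback layout
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} \[%{DATA:thread}\] \[%{DATA:logger}\] - %{GREEDYDATA:log_message}" }
  }
  date {
    # Use the parsed timestamp as the event time instead of ingestion time
    match => [ "timestamp", "ISO8601" ]
  }
}
```

With a filter like this, fields such as `level` and `logger` become searchable in Elasticsearch instead of being buried inside one opaque message string.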
If you have been following this post series, as a result of the last post we launched our ELK stack using Docker, so we have all our components up and running and can start integrating both of these worlds. And we are going to start with the log files from the TIBCO Enterprise Administrator (TIBCO TEA).
In the previous post we talked about what our architecture is going to look like in order to squeeze all the information that's inside our log files using the ELK stack. We also explained the main components involved and their role in this architecture. Now we are going to start building it, beginning by launching our ELK stack.
As you could guess from the title, or even from the last post in the series, we are going to use Docker to do this. But this time we are not going to create any Dockerfile; we are going to use a community image instead. As we said in the last post, ELK is an open-source stack, so there are several images that could do the work for us.
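As a sketch of what launching such an image could look like, here is a minimal `docker-compose.yml`. The choice of `sebp/elk` (a popular all-in-one community ELK image) is an assumption on my part; any similar image exposing these ports would work the same way.

```yaml
# Hypothetical compose file using the sebp/elk community image
version: '2'
services:
  elk:
    image: sebp/elk
    ports:
      - "5601:5601"   # Kibana web UI
      - "9200:9200"   # Elasticsearch REST API
      - "5044:5044"   # Logstash Beats input
```

A single `docker-compose up -d` then brings up Elasticsearch, Logstash, and Kibana together, which is enough for a development setup like the one in this series.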
The idea for this post comes from a comment on my last post, as you can see here. In that post we were talking about the different log capabilities you now have in the new TIBCO TEA Administrator tool. But this user asked: why talk about the log capabilities the product provides out of the box, when you could integrate your traces with a great component stack to squeeze all the information these logs have? This stack was the Elasticsearch (ELK) stack, and I thought: "That is a great idea." So I'm going to create a post series to explain how to integrate your logs with this stack.