With the advent of IoT & big data, the use cases for event processing are at an all-time high. Recently, I have started to explore Complex Event Processing (CEP), which is a subset of stream processing. While I am still trying to understand the basic difference between stream processing and complex event processing, the idea of pattern matching in CEP looks very attractive to me. So, I have decided to get some hands-on experience with it.
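To give a taste of what pattern matching over an event stream means, here is a minimal sketch in plain Java (no CEP engine involved; the class and event names are illustrative). It fires an alert when three consecutive error events arrive. A real engine would express the same rule declaratively instead of hand-coding the state.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal illustration of CEP-style pattern matching: raise an alert
// when three consecutive "ERROR" events are observed in the stream.
public class PatternMatcher {
    private int consecutiveErrors = 0;
    private final List<String> alerts = new ArrayList<>();

    public void onEvent(String event) {
        if ("ERROR".equals(event)) {
            consecutiveErrors++;
            if (consecutiveErrors == 3) {
                alerts.add("3 consecutive errors detected");
                consecutiveErrors = 0; // reset after firing the alert
            }
        } else {
            consecutiveErrors = 0; // any non-error event breaks the pattern
        }
    }

    public List<String> getAlerts() {
        return alerts;
    }

    public static void main(String[] args) {
        PatternMatcher m = new PatternMatcher();
        for (String e : new String[] {"OK", "ERROR", "ERROR", "OK", "ERROR", "ERROR", "ERROR"}) {
            m.onEvent(e);
        }
        System.out.println(m.getAlerts()); // exactly one alert fired
    }
}
```

The point of the pattern is that the alert depends on the *sequence* of events, not on any single event — which is exactly what plain per-message processing cannot easily express.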
HTTP(S) is a fantastic protocol to have around. As HTTP connections are stateless and media independent, we have recently seen a trend toward REST services over SOAP services. My current project exposes most of its services over HTTP. So, when it comes to monitoring these services, we had a handful of options to choose from. However, the data in the access logs is a sweet spot.
The access logs contain most of the critical information we really need in the default log format, the Common Log Format: client information (IP address), URL, HTTP status (200, 404, 500, etc.), size of the payload, and so on. But if we need additional parameters to be logged, we need to use the Extended Log Format (ELF). ELF allows us to plug in customized logic by implementing the Java class weblogic.servlet.logging.CustomELFLogger and overriding logField().
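For reference, a Common Log Format line can be picked apart with a few lines of plain Java. The regular expression below is a sketch of the standard CLF layout (host, identity, user, timestamp, request, status, bytes), not WebLogic-specific code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: parse a Common Log Format access-log line into its fields.
public class ClfParser {
    // host ident authuser [date] "request" status bytes
    private static final Pattern CLF = Pattern.compile(
        "^(\\S+) (\\S+) (\\S+) \\[([^\\]]+)\\] \"([^\"]*)\" (\\d{3}) (\\S+)$");

    public static Map<String, String> parse(String line) {
        Matcher m = CLF.matcher(line);
        if (!m.matches()) {
            throw new IllegalArgumentException("Not a CLF line: " + line);
        }
        Map<String, String> fields = new HashMap<>();
        fields.put("ipAddress", m.group(1));
        fields.put("timestamp", m.group(4));
        fields.put("request", m.group(5));
        fields.put("status", m.group(6));
        fields.put("bytes", m.group(7));
        return fields;
    }

    public static void main(String[] args) {
        Map<String, String> f = ClfParser.parse(
            "127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] \"GET /index.html HTTP/1.0\" 200 2326");
        System.out.println(f.get("status")); // 200
    }
}
```

Once the line is split into named fields like this, forwarding it to a monitoring or event-processing pipeline becomes straightforward.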
If you need to change the format, you can do it from the console with the following clicks: Home > Summary of Servers > &lt;name of the server&gt; > Logging > HTTP > Advanced > Format.
Remember that WebLogic comes with a default ‘Log File Buffer’ setting of 8 KB. This means some access log entries are held in memory (up to the size set in the buffer field) before they actually appear in the access.log file. Of course, we can change the buffer size to 0 to force WebLogic to write the log information to the file immediately.
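The effect of that buffer is the same as ordinary buffered I/O in Java: nothing is guaranteed to reach the file until the buffer fills up or is flushed. A small self-contained sketch (plain Java, not WebLogic code) makes the behavior visible:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

// Illustrates why buffered log entries are not immediately visible:
// the underlying writer only sees data once the buffer is flushed.
public class BufferDemo {
    // Returns the target's contents before and after flushing.
    public static String[] demo() {
        try {
            StringWriter target = new StringWriter();               // stands in for access.log
            BufferedWriter buffered = new BufferedWriter(target, 8192); // 8 KB buffer
            buffered.write("GET /index.html 200\n");
            String before = target.toString();                      // still empty: held in buffer
            buffered.flush();                                       // like setting the buffer to 0
            String after = target.toString();                       // entry reached the target
            buffered.close();
            return new String[] { before, after };
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String[] r = demo();
        System.out.println("before flush: '" + r[0] + "'");
        System.out.println("after flush:  '" + r[1] + "'");
    }
}
```

This is also why setting the buffer to 0 trades a bit of throughput for immediacy: every entry costs a write to disk.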
Below are some of the articles which will help you understand this better:
- Enabling & configuring HTTP logs.
- Setting up access logs with a little more information.
- AMIS’s extended log format using a customized field.
- Access log not being written immediately.
I am working on Oracle’s stream processing to forward the logging information to an event processor, to have some advanced monitoring & alerting in place. I will write more on this topic soon. :)
Oracle SOA Suite comes with a variety of adapters. Among them, the FTP & File adapters are widely used in larger enterprises when we have to work with files. Here, we are going to use the concept of valves/pipelines for zipping files after processing. However, the usage is not restricted to zipping: think of well-known use cases like encryption/decryption, compression/decompression, validation, transformation, filtering, and so on. All of these can be achieved by pre- and post-processing the data in the FTP & File adapters.
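The valve class itself depends on the SOA Suite pipeline API, but the zipping step a valve would apply to the stream flowing through the pipeline can be sketched with plain java.util.zip (the class and method names below are illustrative, not part of the adapter API):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Sketch of the compression step a zipping valve would apply to the
// file content passing through an adapter pipeline.
public class ZipStep {
    public static byte[] compress(byte[] data) {
        try {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
                gzip.write(data);
            }
            return out.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static byte[] decompress(byte[] data) {
        try (InputStream in = new GZIPInputStream(new ByteArrayInputStream(data))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) > 0) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] original = "order-20240101.csv contents".getBytes(StandardCharsets.UTF_8);
        byte[] zipped = compress(original);
        byte[] roundTrip = decompress(zipped);
        System.out.println(new String(roundTrip, StandardCharsets.UTF_8));
    }
}
```

Swapping `compress` for an encryption or validation routine gives you the other valve use cases mentioned above; the pipeline mechanics stay the same.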
Continuing my previous post, where we saw how to dequeue a message from Oracle Streams Advanced Queuing, this post will help us understand the basics of how to enqueue a message into AQ.
Let’s get our hands dirty by defining a payment process for our newspaper shop with Oracle SOA-BPEL 12c. We are going to expose an HTTP service to let the customer pay his invoice.
If you are in the integration world, you must have come across the word ‘queuing’: an architectural pattern which allows us to achieve loose coupling and agility in the real world. Coupling is a measure of the dependency between the actual service provider and the service consumer(s). For a well-functioning SOA, this level of dependency should be as low as possible. With queuing, the service consumer can publish a request and make it available for the provider, which may or may not be available at the time the request is published.
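That decoupling idea can be sketched in plain Java with a BlockingQueue standing in for the queue: the producer publishes its request even though no consumer is running yet. This illustrates the pattern only, not Oracle AQ itself; the names are hypothetical.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// The queue decouples producer and consumer: the request is published
// even though no consumer is available at publication time.
public class QueueDemo {
    public static String runOnce() {
        try {
            BlockingQueue<String> queue = new LinkedBlockingQueue<>();

            // Producer publishes and moves on; no consumer exists yet.
            queue.put("pay-invoice-42");

            // Consumer starts later and picks the request up.
            final String[] received = new String[1];
            Thread consumer = new Thread(() -> {
                try {
                    received[0] = queue.take();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            consumer.start();
            consumer.join();
            return received[0];
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("processing " + runOnce());
    }
}
```

The producer never learns who (or whether anyone) consumed the message; that independence is exactly the low coupling the pattern is after, and AQ adds persistence and transactions on top of it.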
Oracle’s AQ (Advanced Queuing) is an extremely powerful (and underused) message queuing facility in the Oracle Database, based on Oracle Streams. It is an Oracle DB feature which is said to be as scalable and reliable as WebLogic’s JMS. This post is not so much about de-coupling as about the basic implementation of the Oracle AQ adapter (more details) in SOA 12c.