Event Driven Architecture with Oracle EDN

"The flying Dutchman" (see also photo page)

Key Takeaways
  • Event Driven Architecture helps with decoupling applications
  • Think about the names used for events
  • Let publishers publish and filter within the subscribers
  • Use metadata within event headers
  • Standardize the entities within events, so that the event is a first-class data object


There is a lot of hype around Microservices and the use of events for implementing the choreography pattern. However, this works well for companies like Netflix and Twitter, while many organisations are still struggling with files and ESB-like products. My current client also uses an ESB, namely Oracle SOA Suite 12c, for integrations. We cannot simply throw away this ESB, but we can make use of the event mechanism built into it. This blog describes how we use the EDN (Event Delivery Network) component, which is used within SOA composites to publish events and to subscribe to events.


Oracle has a component that you can use to publish events and to subscribe to events within a SOA composite. To publish, just use the invoke activity with the event name and the content of the event. Within a composite you can subscribe to events and set filters. You can also configure the "oneAndOnlyOne" consistency property and indicate whether you want a durable or non-durable subscriber. EDN hides the underlying JMS provider, which can be changed (WebLogic JMS or Oracle AQ). Separate topics can be defined per event, or one topic can be used for all events.
  • Applications must always be abstracted by a corresponding SOA Composite. Applications should not use JMS directly 
  • EDN cannot be used directly from within Oracle OSB


Our events are based on business entities and the operations executed on those entities. We use the following definition:

BusinessObject    The name of the business entity, e.g. Invoice, Customer
Operation         Created, Updated, Deleted
TrackingId        Tracking ID so that flows can be followed end to end
SourceSystem      The system that published the event
TargetSystem      The target system to which the event must be sent explicitly;
                  this is an optional field
Identifiers       Used to add metadata, e.g. invoice number, location, order number
Message           The message itself; this contains the content of the business entity
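As an illustration, the event definition above can be sketched as a plain data structure. Python is used here purely for illustration; the class name and the example values are ours, but the fields follow the definition above:

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class BusinessEvent:
    """Illustrative event envelope following the definition above."""
    business_object: str                  # e.g. "Invoice", "Customer"
    operation: str                        # "Created", "Updated" or "Deleted"
    tracking_id: str                      # correlates all steps of one flow
    source_system: str                    # the system that published the event
    target_system: Optional[str] = None   # optional explicit target system
    identifiers: dict = field(default_factory=dict)  # metadata, e.g. invoice number
    message: str = ""                     # the business entity content (payload)


# Example: an invoice created in a (hypothetical) ERP system
event = BusinessEvent(
    business_object="Invoice",
    operation="Created",
    tracking_id="TRK-0001",
    source_system="ERP",
    identifiers={"invoiceNumber": "INV-42", "location": "NL-01"},
    message="<invoice>...</invoice>",
)
```

Note that the payload lives in a single `message` field, while everything a subscriber needs for routing sits in the header fields around it.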

Master data entities
This works well for master data entities, because you want events to indicate that entities have been created, updated or deleted. We use this to decouple the MDM solution from the interested applications. A new application is easily added by creating a SOA composite subscriber for it.

Transactional data
For transactional data the solution falls short, because context is missing. For example, when an invoice is received from a customer, you want to publish an InvoiceReceivedFromCustomer event. In our case we use the Created operation on Invoice, but this does not say much, so we had to add more context information to the metadata.

Lessons learned
  • Think about the names you use for events; Invoice_Create is too vague and subscribers will need extra metadata to filter on the correct event
  • Think about the data you put in the message
  • Master data entities are especially good candidates to publish

Data will change

The content of entities will change over time, e.g. fields are added. After such a change you want to synchronize the interested applications with the new information. This is especially needed for master data. So the publishing and subscribing applications are decoupled from each other, but they are still strongly coupled to the data: you need to synchronize the new dataset to the applications (as a batch). There are several options:

  • Use ETL to synchronize the data

This can be a good option when there is a lot of data. However, extra tooling and data transformation are needed, which you tried to avoid, and the source and target systems are coupled again.

  • Publish all data and make use of the Pub-Sub implementation

This works fine when there are fewer entities. You don't need extra ETL tooling, but reuse the publish-subscribe mechanism. We use this approach, and also use the targetSystem field within the event when we know it is only relevant to particular applications.
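A minimal sketch of this re-publish approach (the function and system names are hypothetical; the event fields follow the definition given earlier): a resynchronization job simply emits one Updated event per entity through the normal publish mechanism, setting targetSystem when the refresh is only relevant for one application:

```python
def republish_entities(entities, publish, target_system=None):
    """Re-emit every entity as an 'Updated' event through the normal
    pub-sub channel, so subscribers resynchronize without extra ETL."""
    for entity_id, payload in entities.items():
        publish({
            "businessObject": "Customer",
            "operation": "Updated",
            "trackingId": f"RESYNC-{entity_id}",
            "sourceSystem": "MDM",
            "targetSystem": target_system,   # None means: all subscribers
            "identifiers": {"customerId": entity_id},
            "message": payload,
        })


# Resync two customers to one specific target application
published = []
republish_entities({"C1": "<customer/>", "C2": "<customer/>"},
                   published.append, target_system="CRM")
```

The point is that the resync reuses the same event envelope and the same subscribers as the day-to-day flow, so no second integration path has to be built.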

Lessons learned
  • Think about data reconciliation for the case when data fields change
  • Try to avoid ETL, because it couples the source and target systems again


What I often see in integration projects is that for each interested application a new integration is built specifically for that application. For example, a new supplier wants to retrieve invoice messages from the company, but in its own format (for example CSV) and over its own protocol (for example SFTP). The application that generates the invoices then just exports yet another file. What I propose instead is to let the publishing application generate Invoice events. It should have no knowledge of the interested applications; it just does its job. Then, for each supplier, create a Subscribe composite that filters the invoices targeted at that supplier and maps the invoice to the supplier's format and protocol. This is where the subscriber composite filters come in.

With the filter you can define which events you want to receive. Note that you can also indicate whether you want a durable subscriber; in that case you still receive the events that were published while the composite was stopped.
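The filtering idea can be sketched as follows. This is an illustrative Python sketch of the logic, not the EDN filter syntax itself (in SOA Suite the filter would be an XPath expression configured on the subscription); the key point is that the subscriber inspects only the metadata fields, never the message payload:

```python
def metadata_filter(event, business_object, target_system):
    """Accept an event based on its metadata only (never on the message
    payload), mirroring a filter on an EDN subscription."""
    return (event["businessObject"] == business_object
            and event.get("targetSystem") in (None, target_system))


# The publisher emits everything; each subscriber keeps only its share.
events = [
    {"businessObject": "Invoice", "targetSystem": "SupplierA", "message": "..."},
    {"businessObject": "Invoice", "targetSystem": "SupplierB", "message": "..."},
    {"businessObject": "Customer", "targetSystem": None, "message": "..."},
]
for_supplier_a = [e for e in events
                  if metadata_filter(e, "Invoice", "SupplierA")]
```

An event without an explicit targetSystem is accepted by every subscriber of that business object, which matches the optional nature of the field.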

Lessons learned
  • Let publishers publish all data and let subscribers filter out the data they need
  • Filter on the metadata part of the event, not on the event message content

Error handling

Functional and technical errors are unavoidable. This is also the case with events, but with the extra complexity that the publisher is decoupled from the subscriber(s). This means that once the event is published from the source, the work is done there. But errors, both functional and technical, can still occur within the subscriber. You have to think about the following scenarios:
  • Do I want to be able to re-publish the same event?
    The consequence is that target systems must implement idempotency
  • Does the sending application need to know that all targets have successfully handled the event?
    In my opinion you should try to solve the problem at the target, because otherwise you introduce strong coupling again. This also depends on the business requirements, of course.
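The first scenario can be addressed with an idempotent subscriber. A minimal sketch, assuming the trackingId from our event definition uniquely identifies an event (in practice the set of handled IDs would live in a persistent store, not in memory):

```python
class IdempotentSubscriber:
    """Processes each event at most once by remembering handled
    tracking IDs, so re-published events are safely ignored."""

    def __init__(self, handler):
        self.handler = handler
        self.seen = set()       # in practice: a persistent store

    def on_event(self, event):
        key = event["trackingId"]
        if key in self.seen:
            return False        # duplicate: already handled, skip
        self.seen.add(key)
        self.handler(event)
        return True


handled = []
sub = IdempotentSubscriber(handled.append)
sub.on_event({"trackingId": "TRK-1", "message": "a"})
sub.on_event({"trackingId": "TRK-1", "message": "a"})  # re-published duplicate
```

With this in place, the publisher can safely replay events after a failure without coordinating with each target system.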


There are more topics that could be discussed, e.g. event versioning and data security. Maybe another time.
As always, feedback and questions are very welcome!

Event driven architectures can be a good solution to decouple applications! We can now connect new applications faster than before, thanks to the Pub-Sub pattern and the standardization of the event data.
Think about the events you publish, because otherwise you still need a lot of filter logic within the subscribers. Also think about how you want to solve data synchronization/reconciliation when the data changes.


Microservices mindmap

"The tree" - See also  my photo page

When you are fairly new in Microservices land, a lot of terms are fired at you. So, also for my own understanding, I have made a mindmap. I think it is in good shape now, so I can share it with you. As always, feedback is very welcome!

You can download the mindmap here.


Cloud to Cloud Application Integration

A lot of applications have integration possibilities, and cloud applications are no exception. The question I got from a customer is whether to use a point-to-point integration between cloud applications or to go through their ESB solution. This blog describes some considerations.

The customer has an HRM application in which job vacancies are managed; that system also handles the full applicant process flow. They also have another cloud application that handles the job vacancies: it posts the jobs to social sites and other channels to promote the vacancies, and it has some intelligence to advise job seekers on new vacancies based on previous visits or profiles. The job vacancies need to be sent to the vacancies application, and applicant information needs to be sent to the HRM application when a job seeker actually applies for a job. Status information about the job application is also sent back. This flow of information is depicted in the next figure.

Integration options
Several options were discussed for integrating these cloud applications.

Out-of-the-box integration
The two applications have an out-of-the-box integration option, so integrating this way is easy.

However, also consider:
  • You depend on the cloud suppliers for integration at the infrastructure level
  • Monitoring does not conform to your company's guidelines; maybe there is no monitoring at all
  • Exception handling does not conform to your company's guidelines, so what happens when something goes wrong at the integration level?
  • What happens when one cloud application migrates to another version? The other application may have to migrate as well, and maybe that is not what you want.
  • Security: what are the security capabilities of the applications and the integrations? Do they conform to company standards and to regulatory requirements?

This option is viable if the two applications are seen as one application function, i.e. as two modules within the same application. Furthermore, integration is then taken care of by the two cloud suppliers.
Enterprise Service Bus
In case more control is needed over the integration, or job vacancies are also interesting for other applications to integrate with (job boards), the ESB solution can be an option. A lot of companies still use an ESB instead of a Microservices architecture; they have many older applications which need adapters to decouple the systems.

Advantages of this solution:
  • Use of a common business entity on the ESB to decouple applications
  • Monitoring conforming to company guidelines
  • Exception handling conforming to company guidelines
  • Reuse of services is possible
  • Reuse of events by other interested applications
  • The company can decide which parts of the application APIs are used

Note that the ESB can run in the cloud or on-premise.

Disadvantages of this solution:
  • An extra layer is introduced
  • The company must set up the infrastructure connections itself

Depending on your requirements, guidelines and timing constraints, there are always several options for integrating cloud applications. Considerations:
  • Flexibility of information delivery (decoupling)
  • Control over your own connection infrastructure
  • Monitoring requirements
  • Exception handling requirements
  • Security and compliance requirements
  • Upgrade cycles of the cloud applications

Feedback is always welcome!