
Mark is a graph advocate and field engineer for Neo Technology, the company behind the Neo4j graph database. As a field engineer, Mark helps customers embrace graph data and Neo4j, building sophisticated solutions to challenging data problems. When he's not with customers, Mark is a developer on Neo4j and writes about his experiences as a graphista on a popular blog at http://markhneedham.com/blog. He tweets at @markhneedham. Mark is a DZone MVB and is not an employee of DZone.

Micro Services: The Curse of Code Duplication

11.29.2012

A common approach we’ve been taking on some of the applications I’ve worked on recently is to decompose the system we’re building into smaller micro services which are independently deployable and communicate with each other over HTTP.

An advantage of decomposing systems like that is that we could have separate teams working on each service and then make use of a consumer driven contract as a way of ensuring the contract between them is correct.

Often what actually happens is that we have one team working on all of the services, which can lead to a mentality where we start treating the communication between services as if it's happening in process.

One of the earliest lessons I learnt when writing code is that you should avoid repeating yourself – if you have two identical bits of code then look to extract them into a method somewhere and call it from both locations.

This lesson often ends up getting applied across micro service boundaries when we have the same team working on both sides.

For example, if we have a customer that we're sending between two services, then in Java land we might create a CustomerDTO in both services to marshal JSON to/from the wire.

We now have two versions of the 'same' object, although that isn't necessarily the case, because the client might not actually care about some of the fields that get sent: its definition of a customer is different from the provider's.
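As a rough sketch of this duplication (class and field names are hypothetical, and the hand-rolled JSON handling only stands in for what a real project would do with a library like Jackson or Gson), the two services might each define their own view of a customer:

```java
// Provider's view of a customer: everything it knows about one.
class ProviderCustomerDTO {
    final String name;
    final String email;
    final String internalBillingCode; // the consumer never uses this field

    ProviderCustomerDTO(String name, String email, String internalBillingCode) {
        this.name = name;
        this.email = email;
        this.internalBillingCode = internalBillingCode;
    }

    // Naive serialization standing in for a JSON library.
    String toJson() {
        return String.format(
            "{\"name\":\"%s\",\"email\":\"%s\",\"internalBillingCode\":\"%s\"}",
            name, email, internalBillingCode);
    }
}

// Consumer's view: only the fields it actually cares about.
class ConsumerCustomerDTO {
    final String name;
    final String email;

    ConsumerCustomerDTO(String name, String email) {
        this.name = name;
        this.email = email;
    }

    // Naive extraction: pull out just these two fields and ignore the rest.
    static ConsumerCustomerDTO fromJson(String json) {
        return new ConsumerCustomerDTO(extract(json, "name"), extract(json, "email"));
    }

    private static String extract(String json, String field) {
        String key = "\"" + field + "\":\"";
        int start = json.indexOf(key) + key.length();
        return json.substring(start, json.indexOf('"', start));
    }
}

public class DuplicationDemo {
    public static void main(String[] args) {
        String wire = new ProviderCustomerDTO("Alice", "alice@example.com", "B-42").toJson();
        ConsumerCustomerDTO c = ConsumerCustomerDTO.fromJson(wire);
        System.out.println(c.name + " / " + c.email);
    }
}
```

The two classes look so similar that the urge to merge them is strong, even though they deliberately model different things.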

Nevertheless, if we're used to working with tools like IntelliJ, which let us make a change and see it reflected everywhere, we'll end up driving towards a design where the CustomerDTO is shared between the services.

This can be done via a JAR dependency or using something like git submodules, but in both cases we've now coupled the two services on their shared message format.
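With the JAR-dependency route, the coupling shows up directly in the consumer's build file: it pins itself to a specific version of the provider's message classes (the coordinates below are hypothetical), so a change to the shared format means releasing a new JAR and upgrading both services in step.

```xml
<!-- Hypothetical coordinates: the consumer depends on the provider's message JAR -->
<dependency>
    <groupId>com.example.customers</groupId>
    <artifactId>customer-messages</artifactId>
    <version>1.3.0</version>
</dependency>
```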

I think the 'duplication' thing might be less of an issue if you're using a language like Clojure, where you could work with maps/lists without transforming them into objects, but I haven't built anything web related with Clojure so I'm not sure.
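The map-based style can be approximated even in Java: rather than a shared CustomerDTO class, each service handles the payload as a plain map and reads only the keys it needs. The sketch below uses a `java.util.Map` built by hand (in practice a library like Jackson would parse the JSON into one); keys and values are illustrative.

```java
import java.util.Map;

public class MapBasedCustomer {
    // The consumer only touches "name"; any extra provider fields are simply
    // ignored, so the provider can add fields without breaking this code.
    static String greeting(Map<String, Object> customer) {
        return "Hello, " + customer.get("name");
    }

    public static void main(String[] args) {
        // Stand-in for a payload parsed off the wire.
        Map<String, Object> fromWire = Map.of(
            "name", "Alice",
            "email", "alice@example.com",
            "internalBillingCode", "B-42");
        System.out.println(greeting(fromWire));
    }
}
```

There is no class to duplicate or share here; the only contract is the set of keys each side reads and writes.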

As I understand it when we go down the micro services route we’re trading off the ease of working with everything in one process/solution for the benefits of being able to deploy, scale and maintain parts of it independently.

Perhaps the problem I’ve described is just about getting used to this trade off rather than holding onto the idea that we can still treat it as a single application.

I’d be curious to hear others’ experiences but I’ve seen this same pattern happen two or three times so I imagine it may well be common.

Published at DZone with permission of Mark Needham, author and DZone MVB. (source)

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)

Comments

Florin Jurcovici replied on Fri, 2014/08/15 - 4:47am

Age-old dilemma: staying DRY leads to strong coupling.

A microservice could expose a description of its API as WSDL or JSON-WSP. Clients can then parse responses and build requests according to that description, without any need to share code. Sharing code is really, really bad: it creates implementation coupling. A service description shares only the protocol, which is loose coupling.

Added benefit: I never liked DTOs. They are an abomination from the point of view of object orientation, because they split functionality from data. Using a schema-validated JsonObject or DOMNode as the serialization format for your real, functionality-rich classes avoids this abomination and lets you have a proper domain model in your application. When using DTOs, the temptation to implement an anemic domain model is high.
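A sketch of what the commenter seems to be suggesting (names are hypothetical, and a plain Map stands in for a schema-validated JsonObject to keep the example dependency-free): the behaviour-rich domain class builds itself straight from the parsed payload, so no DTO layer exists to duplicate or share.

```java
import java.util.Map;

public class Customer {
    private final String name;
    private final String email;

    private Customer(String name, String email) {
        this.name = name;
        this.email = email;
    }

    // Construct directly from the parsed payload; the null checks stand in
    // for the schema validation the commenter assumes has already happened.
    static Customer fromPayload(Map<String, Object> payload) {
        Object name = payload.get("name");
        Object email = payload.get("email");
        if (name == null || email == null) {
            throw new IllegalArgumentException("payload missing name/email");
        }
        return new Customer(name.toString(), email.toString());
    }

    // Behaviour lives with the data rather than in a separate service class.
    String displayLabel() {
        return name + " <" + email + ">";
    }

    public static void main(String[] args) {
        Customer c = Customer.fromPayload(
            Map.of("name", "Alice", "email", "alice@example.com"));
        System.out.println(c.displayLabel());
    }
}
```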

IDEA makes it easy for you to act as if there were an implementation coupling where there should be none. But IMO that's not a good thing, and you shouldn't misuse this powerful IDE feature to introduce/enforce such a coupling.

