Architecting the Enterprise with Savara

03.01.2010

This article provides a high-level view of the motivations behind the Savara open source project and the Testable Architecture methodology upon which it is based. To best explain these motivations, we will discuss the project in terms of four properties that are desirable in the development of any Enterprise Architecture. We will also discuss a case study concerning a global insurance company that evaluated the potential benefits of a Testable Architecture approach by having two independent teams define its architecture, one using the Testable Architecture approach and the other using more traditional methods.

Co-authored by Gary Brown, Senior Software Engineer at Red Hat

 

Accuracy - “But does it do what I want?” said the customer to the sales clerk.

Accuracy is all about making sure software systems do what users and organisations need them to do. The trick is how we can measure accuracy at different stages in the development of a solution, rather than waiting until it is delivered. We are used to applying manual governance, in which people make judgements about the accuracy of intermediate artefacts along the way.

In an industry that prides itself on automation, we make little use of that same computable automation when governing and measuring the accuracy of such intermediate artefacts. This is the ethos behind SAVARA, where each stage of the Software Development Lifecycle (SDLC) is validated to ensure it conforms to the artefacts of preceding stages, checking that each stage accurately reflects the original business requirements. This ensures that misalignment with the business requirements is detected at the earliest possible stage, rather than during unit and system testing.

The computable checking needed to verify that artefacts are aligned is what we mean by the term Testable Architecture. Testable Architecture thus augments the testing of code by promoting testing into the requirements gathering and design stages of the SDLC, between the requirement and model artefacts that collectively may be considered the architecture description.
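
To make the idea of computable checking a little more concrete, the following is a minimal sketch of the principle, written for this article rather than taken from SAVARA itself: a requirement scenario, captured as an ordered list of message exchanges, is checked against the message flow described by a design model. All of the names (Broker, Insurer, QuoteRequest and so on) are invented for illustration.

import java.util.List;

// Hypothetical illustration of "computable checking": a requirement scenario
// (an ordered list of expected message exchanges) is verified against the
// message flow declared by a design model. This is not SAVARA's API, just a
// sketch of the principle that alignment between artefacts can be automated.
public class ConformanceSketch {

    // A single message exchange: who sends what to whom.
    record Exchange(String from, String to, String messageType) {}

    // Returns true if every exchange in the scenario appears in the design's
    // flow, in the same relative order (the design may contain extra steps).
    static boolean conformsTo(List<Exchange> scenario, List<Exchange> designFlow) {
        int next = 0;
        for (Exchange step : designFlow) {
            if (next < scenario.size() && step.equals(scenario.get(next))) {
                next++;
            }
        }
        return next == scenario.size();
    }

    public static void main(String[] args) {
        // The requirement scenario, as captured from the business...
        List<Exchange> scenario = List.of(
            new Exchange("Broker", "Insurer", "QuoteRequest"),
            new Exchange("Insurer", "Broker", "QuoteResponse"));

        // ...and the message flow extracted from the design model.
        List<Exchange> design = List.of(
            new Exchange("Broker", "Insurer", "QuoteRequest"),
            new Exchange("Insurer", "CreditCheck", "CreditCheckRequest"),
            new Exchange("CreditCheck", "Insurer", "CreditCheckResult"),
            new Exchange("Insurer", "Broker", "QuoteResponse"));

        System.out.println("Design conforms to requirement: "
            + conformsTo(scenario, design));
    }
}

The check here simply confirms that the scenario’s exchanges occur, in order, within the design’s flow; the value of the approach lies in running this kind of check automatically at every stage, so that a design that drifts from the requirements is flagged immediately rather than discovered during testing.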

It is perhaps salient to reflect on what we would have to do manually to ensure that code is aligned. Imagine checking the code line by line with a large pad of paper, trying to figure out and trace a specific test case that represents some business requirement. In the 1970s this is exactly what we did before compiling a single program from a set of punched cards or paper tape. We printed the code and checked it line by line because on-line testing was not something we could do.

The advent of more powerful languages and compilers with type systems, coupled with on-line editors and access to compute power, changed this and led to more automated assistance. Testable Architecture takes this evolution to the next level of refinement and focusses on direct alignment between designs and requirements.

The subject of our case study is a company embarking on a business-process-led service enablement of their IT landscape, to make it more agile and able to deal with new demands far faster than it can today. They are looking to automate where possible, based on a Service Oriented Architecture.

Typically they run multiple work streams to deliver on their vision, and each work stream starts by gathering more detailed requirements that document the flow of information between services. In most cases the services do not yet exist, but some legacy applications have already been service enabled to the extent that they have a decoupled service contract defining what they can provide.

The requirements are manually checked and used to frame a solution. The solution delivers the set of services needed and any flows between those services where they are required to collaborate. Having set out on this fairly typical approach to modeling a solution, which would then be used to drive delivery of that solution, the only means of ensuring that the model meets the requirements, and that the requirements themselves are consistent and implementable as a collective, is manual inspection.

When the same company used the Testable Architecture methodology and the initial tools from SAVARA, they found that they were able to test the accuracy and consistency of the requirements, and indeed found requirements that were ill-formed and not implementable relative to other requirements. Testing models against requirements in this way provided a proven and accurate solution.

 

Efficiency - “But does it do it quickly enough, without unduly increasing my electricity bill?” said the customer to the sales clerk.

Automation, when applied properly, drives greater efficiency in many industries. When considering how we may apply automation to the Software Development Lifecycle (SDLC), we need to understand the SDLC in its many variants: the gathering of information that frames a solution, the creation of intermediate artefacts that help drive a solution, and the solution (the code) itself. SAVARA aims to deliver strong and detailed contracts between each stage of the SDLC, as well as between individual components (i.e. services) within the architecture. This leaves less room for ambiguity and makes the development process more efficient, especially when dealing with a large and geographically distributed development team.
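
As an illustration of what a strong, detailed contract between components might look like, the sketch below defines a hypothetical service interface; the operation, type and fault names are invented for this example, and a SAVARA-driven project would more typically express such a contract as a WSDL or similar service description derived from the architecture.

// Hypothetical sketch of a service contract, with invented names; in practice
// such a contract would more likely be a WSDL or similar service description
// derived from the validated architecture rather than a hand-written interface.
public interface QuotationService {

    record QuoteRequest(String policyType, String customerId, double coverAmount) {}
    record QuoteResponse(String quoteId, double premium) {}

    // A declared fault, so that callers can distinguish error returns from
    // valid returns instead of relying on ad hoc mediation.
    class QuotationFault extends Exception {
        public QuotationFault(String reason) { super(reason); }
    }

    QuoteResponse requestQuote(QuoteRequest request) throws QuotationFault;
}

Making the contract this explicit matters most when caller and implementer are built by different teams in different locations, because both sides are bound to the same unambiguous definition.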

The company involved in the case study found that using the Testable Architecture methodology to model their architecture, and to test those models against the original business requirements, significantly improved their efficiency: the time taken to go from requirements to a model of the solution, and to the artefacts needed to drive delivery, was reduced by 80%.

The normal method for doing the same required the Domain Experts (DEs), Business Analysts (BAs) and the architect to schedule meetings and deal with the requirements in isolation from the design. As the design was constructed, additional meetings followed, and issues such as requirement conflicts and ambiguity were dealt with. As a result the normal method was slow (because it was subject to manual checking) and not amenable to a faster sprint with a higher bandwidth of exchanges between the DEs, BAs and the architect.

In the Testable Architecture approach the tools were deployed on a laptop in a conference room with the DEs, BAs and the architect. A projector was used to share the session, which involved normalisation of requirements in parallel with modeling of the solution. The work was done over a three-day period in which the model was tested against the requirements at each stage; new requirements were then normalised, and the model was extended and tested again against the new or modified requirements. The ability to collaborate with the DEs and BAs, through the assisted automation that the tools provided and guided by the methodology, led to a more focussed and far more rapid experience that resulted in a better solution, delivered much faster.

 

Quality - “I don’t want the extended warranty, I just want it to work for the next 2 years” said the customer to the sales clerk.

We are often asked to buy the extended warranty. In software systems we accept defects, yet call them bugs to hide the fact that something clearly went wrong. Facing up to defects is an important focus of the innovation behind SAVARA. If we can enforce a higher degree of automation, gain efficiencies in delivery and ensure accuracy along the way, then quality should go up.

We don’t pretend that defects will simply become non-existent, because the requirements themselves may be wrong. What we do aim to do is increase the overall quality of the solution and remove many, if not all, of the major design flaws that cost so much when solutions are delivered. A key part of SAVARA is to ensure quality in a computable way, supported by appropriate methodologies and processes.

For example, the service 'unit tests' can be automatically derived from the requirements (i.e. scenarios), removing the need to manually create tests based on a potentially incorrect interpretation of the business requirements. This ensures defects are detected at the earliest possible stage. Any stage in the SDLC where people are required to create artefacts from imprecise information, such as unstructured documentation for requirements or architectural specifications, has the potential for mistakes to be made. Enabling component testing to be driven from the original captured business requirements ensures that the services correctly implement those requirements.
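
As an illustration of what such a derived test might look like (a sketch in JUnit written for this article, not SAVARA’s generated output), the test below exercises the hypothetical QuotationService contract sketched earlier, with the request message and the expected reply taken directly from the requirement scenario rather than from a developer’s interpretation of it.

import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Illustrative sketch only (not SAVARA-generated code): the input message and
// the expected reply are taken directly from a requirement scenario, so the
// test encodes the business requirement itself. QuotationService is the
// hypothetical contract sketched earlier in this article.
public class QuotationScenarioTest {

    // Simple in-memory stand-in for the service under test.
    private final QuotationService service =
        request -> new QuotationService.QuoteResponse("Q-1001", 250.0);

    @Test
    public void quoteRequestScenario() throws Exception {
        // Message taken from the scenario: Broker -> Insurer : QuoteRequest
        QuotationService.QuoteRequest request =
            new QuotationService.QuoteRequest("MOTOR", "CUST-42", 10_000.0);

        QuotationService.QuoteResponse response = service.requestQuote(request);

        // Expected reply taken from the scenario: Insurer -> Broker : QuoteResponse
        assertEquals("Q-1001", response.quoteId());
        assertEquals(250.0, response.premium(), 0.001);
    }
}

Because the assertions restate the scenario rather than someone’s reading of it, a failing test points either at a defective service or at a defective requirement, and both are worth knowing about as early as possible.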

Through the use of the Testable Architecture methodology and the SAVARA tools, the company associated with the case study discovered two major design flaws that the more classic approach had failed to spot. One flaw was the incorrect identification of common services, which was rectified mid-stream; the other was in the use of a legacy service that failed to distinguish error returns from valid returns, and so required more complex mediation to propagate the errors into WSDL faults.

The latter was identified because it led to an error, detected when testing the architecture against the requirements, that was based on the underlying identity of a message relative to the business transaction. The former was identified because Testable Architecture forces you to decompose discrete behaviours. A salient thought is that remediating these two design flaws, which would ultimately have resulted in a low-quality solution, saved almost 25% of the entire budget for the workstream.

 

Fidelity - “Will it work the same when I take it home?” said the customer to the sales clerk.

When we go into production, where a software system is made operational, it would be of huge benefit if the system operated as described by the design and as framed by the requirements. And if those requirements change over time, it would be good to know the impact, and to know that when those changes are deployed they too will perform as expected.

The use of SAVARA to monitor production environments protects this fidelity of operation against the requirements, by ensuring that the executing solution continues to behave according to the architecture and the original business requirements, even where some of the components (i.e. services) are obtained from third parties.

Without the Testable Architecture approach, it would be difficult to monitor a runtime environment to understand the status of a business transaction, or, more importantly, to detect that a problem has occurred. This is due to the lack of a validated global model description of the architecture. Some solutions are available that use rules to detect certain specific situations (e.g. using Complex Event Processing, CEP) and then perform remedial actions. However, these are driven by manually defined rules, which may be incomplete or inaccurate. The other approach that can be used is to monitor new business transactions against templates created by monitoring past transactions. Although this can be useful, it assumes that previous transactions were correct and that future transactions will not change, which they invariably do in an evolving business.

The key point about Testable Architecture is that everything is driven from the original business requirements, which in turn are used to create a validated global description of the architecture. This architectural description represents the flow of the business transactions across all components in the architecture, and can include other non-functional requirements, such as SLAs, that can also be monitored and enforced at the global level. Using this as the reference model for monitoring an architecture means that as changes are introduced, the reference model will always show the up to date flow that should be executing. It also means that monitored business transactions can be viewed against the original architectural specification, enabling business users to pose questions in a business context.
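
As a rough sketch of this principle (an illustration written for this article, not SAVARA’s monitoring implementation), the fragment below checks observed message events, as they arrive, against the expected flow taken from a validated architecture description, and reports the first deviation.

import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of model-driven runtime monitoring: observed message
// events are checked, as they arrive, against the expected flow taken from
// the validated architecture description. An illustration of the principle
// only; all names are invented for this example.
public class FlowMonitorSketch {

    record MessageEvent(String from, String to, String messageType) {}

    // Tracks the progress of one business transaction against the expected flow.
    static class TransactionMonitor {
        private final Iterator<MessageEvent> expected;

        TransactionMonitor(List<MessageEvent> expectedFlow) {
            this.expected = expectedFlow.iterator();
        }

        // Returns true if the observed event matches the next expected step.
        boolean observe(MessageEvent event) {
            return expected.hasNext() && expected.next().equals(event);
        }
    }

    public static void main(String[] args) {
        List<MessageEvent> expectedFlow = List.of(
            new MessageEvent("Broker", "Insurer", "QuoteRequest"),
            new MessageEvent("Insurer", "Broker", "QuoteResponse"));

        TransactionMonitor monitor = new TransactionMonitor(expectedFlow);

        // An out-of-sequence or unexpected message is reported immediately,
        // rather than being discovered later from logs.
        MessageEvent observed = new MessageEvent("Insurer", "Broker", "PolicyCancelled");
        if (!monitor.observe(observed)) {
            System.out.println("Deviation from architecture detected: " + observed);
        }
    }
}

The essential difference from rule- or template-based monitoring is that the expected flow is derived from the same validated description used throughout the SDLC, so it evolves with the requirements instead of having to be maintained by hand.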

 

Summary

SAVARA is an open source initiative that aims to provide end-to-end tooling support, and a methodology to guide the use of those tools, to assist Enterprise Architects, Subject Matter Experts, Business Analysts, Solution Architects and Developers in the framing and delivery of the complex distributed systems that form a business solution. SAVARA supports early testing of designs against gathered requirements, to ensure formal alignment of requirement and design artefacts, as well as providing run-time monitoring of the delivered solution against the validated artefacts.

In one case study the net effect was an 80% reduction in the cost of going from requirements gathering to the definition of the technical contract that drove delivery, and an indirect saving of 25% across the entire SDLC through early defect detection. SAVARA embodies the four key properties associated with good solutions, namely Accuracy, Efficiency, Quality and Fidelity.

Published at DZone with permission of its author, Steve Ross-Talbot.

(Note: Opinions expressed in this article and its replies are the opinions of their respective authors and not those of DZone, Inc.)