Designing the IBM z/OS Connect user interface

Overview
This section of the case study explores the design and development process of the IBM z/OS Connect user interface.

To set the scene, IBM z/OS Connect is a well-established product with 7+ years of investment in the Eclipse IDE platform. In 2019 the decision was made to shift the product to a web-based experience, enabling it to expand into new markets and speeding up our users' time to value. For full details of the design research that led to this change, please read Defining the future direction of IBM z/OS Connect.
Key objectives
  1. Shift the z/OS Connect product experience from an Eclipse IDE to a web-based platform.
  2. Enhance the product's capabilities (compared to the existing Eclipse product).
  3. Create a simple and intuitive user interface.
  4. Offer an experience familiar from other products in IBM's integration portfolio (e.g. IBM App Connect).
To narrow the scope, this case study focuses specifically on the creation of Db2 APIs in the UI (Db2 being a database that runs on z/OS). This was the first z/OS subsystem supported in the new web-based tooling, and it thereby served as the basis for the product experience.
Product research
As-is scenario - Db2 API creation
The first stage of the design process was mapping out the user's journey through the existing z/OS Connect Eclipse interface. This ensured we had a clear understanding of how to create a Db2 API in the existing tooling, and served as a basis for future discussion with clients and the team.
Our objective was to understand how this experience could be improved:
  1. Who was involved?
  2. What were our users' core objectives?
  3. What could be done to simplify this experience?
  4. What were our users' common pain points?
  5. What were the limitations of this experience?
  6. Where did users typically get stuck?
  7. Which features and functionality were used or unused?
To answer these questions we undertook research calls with our clients, development team, and technical sellers. This ensured we captured feedback from multiple stakeholders, and engaged clients and the team early in the development process. We asked each interviewee to guide us through the process of creating a Db2 API on their machine, then asked targeted questions when appropriate. A limitation of this approach was that the client or team member was typically speaking from past experience rather than creating a Db2 API for the first time.
Key findings
  1. When creating a Db2 API, two personas were involved:
    1. A database admin, responsible for creating a Db2 service interface from the database SQL.
    2. A Z application developer, responsible for creating an API interface from the Db2 service interface.
  2. The database admin often struggled to create a Db2 service interface using the command-line tool.
  3. The Z application developer could struggle to discover the available Db2 service interfaces.
  4. The Z application developer often struggled to manage the two API artefacts (.sar and .aar); see the sketch after this list.
  5. Having multiple artefacts meant multiple mapping steps, which felt redundant.
  6. The multiple artefacts had to be kept in step (at the same version).
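To make the artefact pain point concrete, the sketch below models the two artefacts and the version check users were effectively performing by hand. This is a hypothetical TypeScript illustration; the types and fields are invented and are not the real .sar/.aar formats.

```typescript
// Hypothetical model of the two artefacts a Z application developer had
// to keep in step. Illustrative only; not the real z/OS Connect formats.
interface ServiceArchive {
  // a .sar: the Db2 service interface
  name: string;
  version: string; // e.g. "1.2.0"
}

interface ApiArchive {
  // an .aar: the API built on top of a service interface
  name: string;
  version: string;
  serviceDependency: { name: string; version: string };
}

// The pain point: nothing enforced this automatically, so users had to
// verify manually that the .aar was built against the matching .sar.
function versionsInStep(sar: ServiceArchive, aar: ApiArchive): boolean {
  return (
    aar.serviceDependency.name === sar.name &&
    aar.serviceDependency.version === sar.version
  );
}
```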
Aligning product experience
The second stage of the design process was to explore the other products in IBM’s integration suite. The objective of this research was to understand what UX patterns had been used, then utilise similar patterns to create a familiar product experience where possible.

To achieve this we mapped out the product experience for multiple IBM products including IBM App Connect, and IBM API Connect.
This research proved invaluable, as it enabled us to shortcut many of the lessons learned by other teams. It also helped us build strong working relationships with other departments of IBM, which later led to the sharing of designs and code and further accelerated our time to market.
Key findings
  1. IBM App Connect had created two patterns which were perfect for our use case:
    1. The ‘prescriptive canvas’, which enabled users to quickly assemble APIs in a visual format.
    2. The ‘cognitive mapper’, which enabled users to create complex mappings between two endpoints using JSONata, a query and transformation language developed at IBM (see the sketch after this list).
  2. IBM API Connect had an experience for creating OpenAPI documents, which we could integrate if necessary.
  3. A design guild existed that aimed to bring IBM integration products together, which we later joined.
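To give a flavour of the kind of mapping the cognitive mapper builds on, here is a minimal sketch using the open-source jsonata package. The Db2-style input shape is invented purely for illustration; this is not the product's own mapping code.

```typescript
// A small taste of a JSONata mapping between a back-end response shape
// and an API response shape (npm install jsonata).
import jsonata from "jsonata";

// Invented Db2-style result, purely for illustration.
const db2Response = {
  FIRST_NAME: "Ada",
  LAST_NAME: "Lovelace",
  ORDERS: [{ AMOUNT: 120.5 }, { AMOUNT: 80.0 }],
};

// One JSONata expression maps the back-end shape to the API shape.
const expression = jsonata(`{
  "customerName": FIRST_NAME & " " & LAST_NAME,
  "orderTotal": $sum(ORDERS.AMOUNT)
}`);

async function main() {
  // evaluate() is asynchronous in jsonata v2+.
  const apiResponse = await expression.evaluate(db2Response);
  console.log(apiResponse); // { customerName: "Ada Lovelace", orderTotal: 200.5 }
}

main();
```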
OpenAPI-first creation
The next stage of research was to build a better understanding of the ‘contract-first’ process. This was a key requirement identified during the preliminary research, whereby users wanted to build an API starting from an OpenAPI contract that had been defined by the business.
Key questions
  1. Who was responsible for defining the OpenAPI contract?
  2. What stakeholders were involved?
  3. How long would it take to create?
  4. If changes were required, who would make them?
To answer these questions we undertook further research calls with our clients. This topic proved to be far more difficult to research as clients were at varying stages of adopting this strategy.
Key findings
  1. Business analysts are responsible for identifying the API requirements and producing a technical specification (as an OpenAPI document or as a written document).
  2. API specifications take 1-2 months to complete.
  3. Z application developers will receive a pre-defined OpenAPI document to work with (see the sketch after this list); alternatively, they’ll have to translate a written specification into an OpenAPI document.
  4. API security standards are defined by the enterprise security architect.
  5. Application owners will determine who can invoke the API.
  6. An OpenAPI document is a living document and will change as the requirements mature.
  7. If changes are required to the OpenAPI document, they’ll have to be signed off by the business.
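For readers unfamiliar with the format, here is a deliberately minimal example of the kind of OpenAPI 3.0 contract a Z application developer might receive, written as a TypeScript object for consistency with the other sketches. The endpoint and schema are invented; real contracts are far larger.

```typescript
// A deliberately minimal OpenAPI 3.0 contract of the kind a business
// analyst might hand over. Endpoint and schema are illustrative only.
const contract = {
  openapi: "3.0.3",
  info: { title: "Customer API", version: "1.0.0" },
  paths: {
    "/customers/{id}": {
      get: {
        parameters: [
          { name: "id", in: "path", required: true, schema: { type: "string" } },
        ],
        responses: {
          "200": {
            description: "A single customer",
            content: {
              "application/json": {
                schema: {
                  type: "object",
                  properties: {
                    customerName: { type: "string" },
                    orderTotal: { type: "number" },
                  },
                },
              },
            },
          },
        },
      },
    },
  },
} as const;

export default contract;
```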
Sample slides
[Slides from: 2021-04-08-design-partnership-playback-deck]
Interface design
To-be user journey
Our design research later informed the creation of our to-be user scenarios, which outlined the steps of creating a Db2 API in the tooling.
Design exploration
With a clear understanding of our requirements, we began exploring possible designs for the interface. During these early stages we explored the different scenarios to build an overall understanding of how the end product might fit together. We did this to ensure we weren’t designing ourselves into a corner as more functionality was added in the future.

Throughout the design process we shared the designs with our clients to get their feedback. While we received some great feedback, it became increasingly evident that our clients saw these screens as the final designs rather than early mock-ups. We concluded as a team that we needed to share low-fidelity mock-ups with the clients to better validate the flow.
Team critique
Critique sessions were organised with the development team throughout the design process. These sessions gave the team awareness of the direction we were moving in and provided a forum for the team to give feedback at the earliest possible point. These sessions were fundamental to ensure the designs were technically feasible, and allowed the team to start thinking about implementation.

As a team we painstakingly critiqued each screen, assessing the ordering of steps, the language used, and the UX patterns. After each session we’d collate the feedback and create the next iteration of the designs.
Feature prioritisation
Having created a vision of the overall product, we sought to narrow our scope and define a minimum viable product (MVP), i.e. the smallest thing we could deliver that would provide our users with value. As a team we concluded to build the ‘contract-first’ flow first, removing a lot of the non-critical functionality.
Low-fidelity prototype
Having defined our MVP, we began to build a low-fidelity prototype. By building in low-fidelity we hoped to focus our users on the core steps of the flow, and give them the perception that this was not a finished product.
User testing
Iteration 1
Once the designs were refined we began testing. These early user tests sought to validate basic navigation of the UI, and to ensure the API visualisation matched our users’ mental model. We conducted these tests with a selection of customers and IBM stakeholders, supplying each participant with a written scenario and asking them to complete the task in the prototype provided. We asked participants to think aloud throughout the test so we could better understand their thought process.
Number of participants
4 Customers
6 IBM stakeholders
Key findings
  1. Users were able to navigate the interface with relative ease.
  2. Users understood the mental model of the API visualisation (see fig X).
  3. There was confusion over some of the terminology used throughout the prototype.
  4. The OpenAPI document is continually evolving and changing; customers would need the option to re-import it and merge changes.
  5. One customer highlighted that we were missing vital information about the API parameters. This information was critical when defining a mapping between two fields.
Quotes
  1. "In my opinion you’re going in the right direction" —
  2. “How can I quickly test this API in a tool such as postman?” ——
  3. “I’d like to be able to import the z/OS data structures from zFS (z file system).
  4. “How will this looks when there is 100 fields”.
  5. “I’d like to be able to expand and contract the data structures”.
  6. “I’d be interested to see an example COBOL structure with hundreds of fields to see if the current representation is scalable”
  7. “Is there an option to create the API Botton up” [from the back-end application]
  8. “When importing an OpenAPI document, it would be good to have an import from Github option”
Feedback prioritisation
As a team we gathered to discuss the feedback we had received from our clients and IBM stakeholders. While all the feedback was valid, we had to decide what could be implemented in the short term and what would be added to the backlog. We had constraints on time and development resource, and therefore had to define an MVP that would serve as a foundation for future development.
Key thoughts
  1. The ability to test the API from inside the tooling was feasible and could be included in the MVP.
  2. The ability to import directly from zFS was seen as a huge advantage; however, it was a large development undertaking and would not be included in the MVP.
  3. Integration with GitHub was a large development undertaking and would not suit all clients. It would not be included in the MVP.
  4. API creation and API generation were already key experiences present on the future roadmap.
  5. Updating the API document would not be suitable for the MVP due to the development overhead. This was placed on the future roadmap.
Iteration 2
In light of our feedback and subsequent prioritisation, we created the next iteration of the prototype. We later carried out the same test with the same subjects to validate our improvements. During these tests we also shared our prioritisation grid with the clients to communicate our current vs future intentions.
Refinements made
  1. Test functionality added to the tooling.
  2. Improvements to the terminology used in the prototype.
  3. Ability to expand/collapse data structures.
  4. Size and padding of elements were minimised to increase the number of elements on screen.
  5. Use of a large data structure in the prototype.
  6. Ability to see the API parameters used for mapping.
Key findings
  1. Customers were generally understanding of our prioritisation and MVP.
  2. The importance of importing from zFS was reiterated.
  3. Auto-save functionality needed more visual clarity.
  4. An IBM stakeholder from App Connect highlighted that our API visualisation was not consistent with App Connect’s, the product we were trying to emulate.
Hi-fidelity designs
Having tested multiple iterations of the low-fidelity designs, we were confident enough to begin creating high-fidelity designs. These designs introduced stylised elements based upon IBM’s Carbon Design System. We continued to refine the designs based upon the feedback we had received from the previous round of testing.
Refinements made
  1. The API visualisation was updated to feature a single node, aligning with App Connect.
  2. Saving states were added to the UI.
User testing
As with the low-fidelity designs, we continued to test the high-fidelity designs with a group of clients and IBM stakeholders. In the same format as our previous tests, we provided each test subject with a scenario and a prototype, then asked them to complete the given task. To give the perception that these were unfinished designs, we converted each screen to greyscale.
Number of participants
6 Customers
5 IBM stakeholders
Key findings
  1. Customers clearly understood the mapping interface (borrowed from API Connect).
  2. Customers expressed confusion over the change in terminology between the two product versions.
  3. Customers and IBM stakeholders struggled to define the response rules.
  4. Customers and IBM stakeholders were confused by the comment button [mapping interface].
Test limitations
A limitation of our testing was the clickable prototype itself. During the tests our users wanted to explore the UI freely, but the prototypes were built only with the golden path in mind. From this point we decided it would be more valuable to test and refine the coded solution.
*Multiple iterations of the high-fidelity designs were created and tested.
Design specification
After further refinement and testing, the high-fidelity designs were ready for development. To aid our developers we created detailed specifications of each component and how it was situated in the flow. The specifications included the following:
  1. A clickable prototype
  2. A document detailing the components and their different states, edge cases, etc.
  3. A redline file
  4. Accessibility considerations
Development
Development was a very collaborative exercise, and by no means a ‘throw it over the wall’ process. Throughout development we maintained close communication with our development team: we were always on hand for calls, always testing early prototypes, and often refactoring designs to capture details that had been overlooked.

We also produced a number of artefacts with the development team, including component reviews, accessibility reviews, and test plans, to ensure the product was well implemented.
Component review
Accessibility review