Device simulator

Developing applications that respond to device data is really hard if you don't have a device sending the right data to test all use cases. This project helped solve that problem.

Design team

Just little old me

Date

2016, over a 2-week sprint

Simulating fake devices

Due to structural changes, I was one of only two designers left on the IoT Platform design team. This left me 100% responsible for this design delivery, for both experience and interface. Over a 2-week sprint I took a 'simulated devices' concept from an idea through lo-, mid-, and eventually hi-fidelity designs that were delivered to development. I also produced work beyond the original scope in order to get improved UX designs onto the roadmap for future deliveries.

In mid-2016 there was a huge push to improve the developer experience of Watson IoT Platform. After ideation workshops we ended up with multiple ideas of varying feasibility and varying impact.

The concept

One of the ideas that was well received was virtual/fake devices that rendered in the product as if they were real. This would allow users to get started more quickly and explore the product with 'data', but most importantly it would enable device developers to interact with the platform APIs and return data without actually having access to their physical devices.

As an aside

To do this we ran a big-ideas workshop, with prioritisation at the end. This produced a matrix from low to high impact and low to high feasibility. The ideas that landed in the high-impact, high-feasibility quadrant were the 'no-brainers'. This is where simulated devices sat, and why it was picked up as a piece of work.

This was a quick win as 80% of developers weren't able to work with their raw device data.

This idea came to be known as 'Simulated Devices'. With a targeted turnaround of only 2 weeks, I saw this as a good opportunity to own a part of the product, do some UX work, and capitalise on the great relationships I'd built with the development team.

I worked with Product Management to come up with the following problem statement:

"Users are unable to learn and explore the APIs without having to do a large amount of work, such as connecting up real devices, or leaving Platform and using tools such as Node-Red". Off the back of this we defined two user stories.

User stories

User story: Simulating data for a registered device

As a developer (either experienced or novice), I want to take the behaviour of a known device and have the IoT Platform simulate that device's behaviour and data schema.

User story: Simulating multiple new devices and data

As an application developer or tester, I want the platform to simulate multiple devices of a known type and device behaviour (schema) so that I can test my application.
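To make these stories concrete, a simulated device of a known type might be described by a small template like the sketch below. This is purely illustrative TypeScript; the names and fields are my assumptions, not the Watson IoT Platform's actual schema.

```typescript
// Illustrative sketch only: a hypothetical template describing how a
// simulated device of a known type might be configured.
interface SimulationTemplate {
  deviceType: string;               // the registered device type to mimic
  eventName: string;                // the event the simulated device publishes
  intervalSeconds: number;          // how often a simulated event is sent
  payload: Record<string, unknown>; // sample data matching the device's schema
}

const thermostatTemplate: SimulationTemplate = {
  deviceType: "thermostat",
  eventName: "status",
  intervalSeconds: 60, // matches the default of 1 event every minute
  payload: { temperature: 21.5, humidity: 40 },
};
```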

Exploring the flow: Lo-fi

We wanted 'Simulated Devices' to feel like the user had a real device sat in the corner of their browser, one they could access and interact with as if a physical device were next to them. It became obvious that it therefore needed to exist as a persistent component, accessible to all users at every point in their journey through the UI.

Producing these lo-fi walkthroughs of the initial experience highlighted just how far we could go in terms of the 'depth' of the simulations. Sharing them with the wider team, we began to scope back the level of customisation at these points. In part this was to play nicely with development, reducing the complexity of the build given the extremely short turnaround, but from a pragmatic design view it also made sense to start users with a simpler level of detail and ramp it up over time (rather than the other way around).

I also prototyped the interaction to confirm that the real estate I was exploring in the lo-fi was feasible.

Increasing fidelity: Mid-fi

Exploring the override issue

One of the other problems the lo-fis highlighted was a complexity issue with overriding the 'templates'. The basic concept of a 'Type' in the product was that it was a higher-level container (for want of a better word) for 'Instances' of that type. The issue the lo-fis uncovered was that, without the ability to set up events at the type level, you would theoretically have to set up events separately for each individual simulated instance.

Using mid-fi wireframes (for the purpose of usability testing), I explored the varying levels of complexity if we did all the setup at the type level, with instances simply drawing on that template (the green outline). However, talking this through with customers, it was clear they wanted more granular control; this was a simplification too far.

The abundance of screens after the green outline was the result of allowing the UI to support customisation at both the instance and type levels, because we had to handle the behaviour of overrides. The flow above was the simplest experience I could come up with, but in follow-up customer presentations it did seem to give customers the required granularity, so the increased development effort was justified and the required resource allocated.
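As a rough sketch of that override model (all names here are hypothetical, not the actual implementation): type-level settings act as a template, and anything set explicitly on an instance takes precedence over it.

```typescript
// Hypothetical sketch of the type/instance override model. An instance
// inherits its type's event configuration; anything it sets explicitly
// overrides the template.
type EventConfig = {
  intervalSeconds?: number;
  payload?: Record<string, unknown>;
};

function resolveConfig(typeTemplate: EventConfig, instanceOverride: EventConfig): EventConfig {
  // Instance-level values win; unset values fall back to the type template.
  return { ...typeTemplate, ...instanceOverride };
}

const template: EventConfig = { intervalSeconds: 60, payload: { temperature: 21 } };
// One simulated instance overrides only its payload; the interval is inherited.
const outlier = resolveConfig(template, { payload: { temperature: 35 } });
```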

What we learned

Below are a couple of usability changes that were made as a result of testing the mid-fis with sponsor users.

Phase 1: Sample hi-fi

The intention of phasing this work for development was to make sure that the tight 2-week timeframe we had to design and develop it didn't constrain the desired experience and functionality. Phase 1 was intended for this delivery. Phases 2 and 3 were added to a roadmap, but were designed as part of the initial 2-week sprint so that they could eventually be implemented without any further design investment.

Phase 2

Shortcuts

This work was done as part of a bid to improve the 'Developer Experience' (with specific reference to a user's first few visits to the product). Phase 1 was validated with existing device developers as a powerful resource for testing; however, it overlooked the inexperienced application developer, such as our persona Chris.

Shortcuts looked to solve this:

Scenario: Chris is building an external application that takes data from Watson IoT Platform.

This simulation concept was applied in many areas of the Platform. Essentially, wherever data is expected but unavailable or non-existent, the Platform offers the ability to simulate data in its absence. Clicking one of these shortcuts effectively runs a script that opens the simulator and creates a simulation.
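A minimal sketch of what such a shortcut handler might do, assuming hypothetical createSimulation and openSimulator helpers (these names are illustrative, not the product's real code):

```typescript
// Illustrative only: what a 'shortcut' click handler might do.
type ShortcutContext = { deviceType: string; eventName: string };

function createSimulation(config: ShortcutContext & { intervalSeconds: number }) {
  // In the real product this would register a simulated device
  // pre-configured for the context the user clicked from.
  return { id: crypto.randomUUID(), ...config };
}

function openSimulator(simulation: { id: string }) {
  // Surface the persistent simulator panel with the new simulation selected.
  console.log(`Opening simulator with simulation ${simulation.id}`);
}

function onShortcutClick(context: ShortcutContext) {
  // A sensible default interval means data starts appearing immediately.
  const simulation = createSimulation({ ...context, intervalSeconds: 60 });
  openSimulator(simulation);
}
```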

What we got wrong

Unfortunately, due to technical constraints, the simulations were local to a user's own account and not stored centrally.

To go some way towards alleviating this, I designed an export/import experience that took the simulated data configuration and allowed it to be shared across teams, albeit in quite a static way.

Import/Export
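A sketch of what that round-trip might look like, assuming the configuration serialises to plain JSON (the exact format and field names are my assumptions):

```typescript
// Illustrative only: exporting a simulation configuration as JSON so a
// teammate can import it into their own account.
const exported = JSON.stringify({
  version: 1,
  deviceType: "thermostat",
  events: [{ name: "status", intervalSeconds: 60, payload: { temperature: 21.5 } }],
});

function importSimulation(json: string) {
  const config = JSON.parse(json);
  // Re-create the simulation in the importer's account; because configs
  // weren't stored centrally, this sharing is static rather than live.
  return config;
}
```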

Phase 3

Event composer and visualiser

One of the main challenges for the development team in the short timeframe was building on the React-based code editor to allow for custom 'functions' inside the configuration. The initial proposed design was cognisant of this, and so was a relatively bare-bones editor, simply allowing code input.
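For illustration, a payload value might embed a user-written function that the editor needed to support, along the lines of the sketch below. The $function convention and the resolver are assumptions of mine, not the platform's actual syntax.

```typescript
// Illustrative only: a payload where one field is computed by a
// user-supplied function rather than being a static value.
const payloadConfig: Record<string, unknown> = {
  humidity: 40,
  temperature: { $function: "return 20 + Math.random() * 5;" },
};

// Resolving such a config at publish time (simplified sketch).
function renderPayload(config: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(config)) {
    const fn = (value as { $function?: string } | null)?.$function;
    // Run the user-supplied snippet for dynamic fields; copy static ones.
    out[key] = typeof fn === "string" ? new Function(fn)() : value;
  }
  return out;
}
```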

For phase 3, I designed a version that used the same principles as the inline modal, providing benefits such as increased context, detail, and control.

After user testing all options, we went for option 3 (which the development team were thrilled with due to it being significantly less work!)

As an aside

If I were to revisit this piece of work, I would explore the idea of making the entire environment 'no-code'. Yes, the targeted persona was a developer, but using a JSON structure still meant they had to learn the required syntax, when in reality we could have given them a dedicated UI to create these simulations.

Visualiser mid-fi

Returning to the idea of simulated devices feeling like 'a device sat in the corner of the product', I looked to create a more visual way of tweaking the simulation data. Below, Chris sets up a way of manually sending events using visual paradigms, rather than editing values in the code.

When combined with shortcuts, this would allow a new user to trial the functions of the IoT Platform without taking the time to hook up a real device. They could test rules and alerts, or see how their dashboard responds in different scenarios, without touching a single line of code.

Additional changes based on user feedback

Scheduling

One of the main issues was that people weren't noticing the schedule accordion. This meant they were missing the ability to change how often events from a particular type were sent, leaving them limited to the default of 1 event every minute.

The fix was to make use of a new component I had been designing for elsewhere on the Platform: a tertiary button with a tooltip-style interaction. The benefit of using this tertiary button was that we could present the user with a simple statement of what was currently set, but use the increased real estate afforded by the interaction to allow for further setup.