2021-2022
Redesign of energy and automation software +
Human Machine Interface
Product Design





Project Overview
Schneider Electric is a French multinational company that specializes in digital automation and energy management. I interned with the L&T Electrical and Automation unit of Schneider Electric as a UX Research and Design intern in 2021 and was offered another position by my seniors as a Product Design intern in 2022. L&T Electrical and Automation had recently been acquired by Schneider Electric, so most of its software required rebranding. I worked across five projects: SmartComm ABT, SmartComm PMS, iMMR Suite, HES/MDP, and SmartComm Configurator. My work involved redesigning software, conducting research, and designing a Human Machine Interface (HMI) device.
My Role:
Product Designer (January 2022 to May 2022)
UX Researcher and Designer (August 2021 to October 2021)
I will walk through the SmartComm ABT redesign process here. Since the other four projects and the Human-Machine Interface design are under a strict NDA, please reach out to me directly if you have any questions about them.
SmartComm ABT
SmartComm ABT (Availability Based Tariff) is a B2B SaaS solution that is used for remote monitoring of meters deployed pan-India.
Why this redesign?
- Major company changes
- Make the user feel in control
- Design was outdated
- Increase business agility
- Usability issues
- Raise the ceiling for site performance
Old SmartComm ABT dashboard design:

A few emotions I encountered after looking at the dashboard for the first time:



Redesign Process
I used a 4-step strategy to redesign SmartComm ABT. The steps were as follows:
Research
Bricklaying
Gap Analysis
Build & Deploy
Research
Before beginning the redesign, it was really important for me to understand what was working for the users and what was not. To find out, I conducted thorough qualitative research.
01. Stakeholder Interview
I divided the stakeholders into 3 different categories, one from each of the Design, Engineering, and Business teams.
Next, I asked them 6 overarching questions to get a sense of how each of them views the problem:
- Why are you considering this redesign?
- What are the major problems according to you?
- Who are the users?
- How will you define success here?
- What has already been achieved?
- What do you aim to achieve with this redesign?
These questions were intentionally made open-ended to gain as much insight from the stakeholders as possible.



02. User Interview
I spoke to 8 users to understand the problems they were facing with the software and how it was affecting their day-to-day productivity. Of these 8, 5 were long-time employees and 3 were recent hires. I asked them the following questions:

- What is your name and age?
- What is your job role?
- How long have you been working with SmartComm ABT?
- Tell me about the first time you worked on SmartComm ABT's dashboard. What was the experience like? What issues did you face?
- Tell me about the process you followed to learn how SmartComm ABT works.
- Which part of SmartComm ABT do you use the most?
- What issues did you face initially?
- What issues do you face currently?
- Tell me about the last time you were trying to rectify an issue on the dashboard and got stuck. How did you go about solving it?
- If you could change 3 things about SmartComm ABT, what would they be?
- Tell me about the steps you would follow if I asked you to teach a newcomer how to use this software.
- Any suggestions on how this software can be improved?
- What other projects have you worked on? Did you face similar issues with them? If yes, how did you go about solving them?
03. Affinity Mapping
After conducting the stakeholder and user interviews, I created an affinity map to group the interview data. This helped me identify the main themes and issues.

Insights from the interviews:
01. No hierarchy
People have no clue what is located where; everything in the software is scattered with no apparent order.
02. Outdated Design
There are contrast and visibility issues with the software. Many users even complained of severe headaches after working on it for long durations.
03. Wrong mental model
Long-time users have simply gotten 'used to' the software because they have been working with it for so long. New users face a lot of difficulty navigating it.
Bricklaying
01. Grouping
Even as a designer with an electronics background, I found it extremely difficult to understand the purpose of each function in the software. The menu items were not labelled with generic terms, which made it even harder to group the content properly. The initial dashboard also had no hierarchy, so it was extremely difficult to judge the relative importance of each element. So, I started off by conducting discussion sessions with stakeholders and current users to group parts that were related to each other in some way.

Brainstorming with stakeholders and current users on how to group the information
02. Defining Hierarchy
Next, to define a hierarchy, I spoke with users directly to understand which features they use the most and which ones the least. I also asked them to categorize the features into different groups.

I made four groups and color-coded them in descending order of importance:
1. Red - most important information
2. Green
3. Blue
4. Yellow - least important information


03. Storyboarding
As part of empathizing with the users, I considered the scenario of Sahaj, a new employee who is expected to work on SmartComm ABT but has no clue how to go about it.

04. Brainstorming and Sketching
Next up, it was time to brainstorm and sketch the dashboard. I started off by making low-fidelity paper sketches.


05. Medium Fidelity Wireframe
From the sketches, I moved on to making medium-fidelity wireframes to share with my team.


Gap Analysis
My team and I went back and forth discussing multiple ideas. After each iteration, there was a feedback session to fill in any gaps and improve the design further. We had 3 iterations in total. Below I've discussed the thought process for each design decision:
01. Prototype
Iteration 1
This is the first design that I created. The navigation pane was moved from the top to the left and given a solid dark blue background with white text for proper visibility. The sections decided during the "Defining Hierarchy" phase were placed as cards one below the other from 1 to 4.


Feedback:
The whole team loved the new design because they found it very clean and systematic. However, they thought the navigation pane on the left would take up too much space that could otherwise be used for monitoring content.
Iteration 2
In the second iteration, the full navigation pane was removed and replaced with icons only, with the icon names expanding as the user hovers over the menu. This way, the monitoring content covers the majority of the screen most of the time and shrinks only when the user hovers over the menu icons, letting the navigation expand.
Usability Test and Feedback:
This iteration was approved by most team members, but we were not sure how it would hold up on larger screens, so we conducted a usability test. The design was tested with 4 participants, whom we observed using the "think-aloud" method. The biggest issue was that, when expanded, the navigation created a lot of negative space, which looked odd. On larger screens (a diagonal size of 23.5 inches or more), we also observed intense saccadic eye movement from the users, so we decided to rework the navigation pane.
Iteration 3 - The Winning Design
Now, since the icon names were domain-specific rather than generic, it was not possible to discard them completely. At the same time, both a fixed navigation pane and an expanding one had to be ruled out because of the issues mentioned above. So I finally settled on 'tooltips' instead: an icon-only strip where the full menu item name appears on hover. This might not have worked on smaller screens but, since we were working with screens larger than 23.5 inches, the menu item names were clearly visible. A rough sketch of the pattern is shown below.
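To illustrate the idea only (the production SmartComm ABT code is under NDA and not shown here), below is a minimal React/TypeScript sketch of an icon-only navigation strip where the full label surfaces as a native tooltip on hover. The menu items, icons, and styling are hypothetical placeholders, not the actual SmartComm ABT labels.

```tsx
import React from "react";

// Hypothetical menu items for illustration only; the real SmartComm ABT
// labels are domain-specific and covered by the NDA.
const MENU_ITEMS = [
  { id: "alarms", icon: "A", label: "Alarms" },
  { id: "block", icon: "B", label: "Current Block Details" },
  { id: "reports", icon: "R", label: "Reports" },
];

// Icon-only navigation strip: the pane keeps a fixed, narrow width, and the
// full menu item name is shown as a native browser tooltip (title attribute)
// on hover, so the monitoring area never has to shrink or reflow.
export function IconNav({ onSelect }: { onSelect: (id: string) => void }) {
  return (
    <nav style={{ width: 56, background: "#0b2a4a" }}>
      {MENU_ITEMS.map((item) => (
        <button
          key={item.id}
          title={item.label}      // tooltip showing the full, non-generic name
          aria-label={item.label} // keeps the icon-only button accessible
          onClick={() => onSelect(item.id)}
          style={{ width: "100%", padding: 12, background: "none", border: "none", color: "#fff" }}
        >
          {item.icon}
        </button>
      ))}
    </nav>
  );
}
```

The key design choice this sketch captures is that labels never occupy layout space: they appear on demand, which avoids the negative space and eye travel observed with the expanding pane in Iteration 2.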
02. Final Usability Test
Finally, the 3rd iteration was tested with 8 participants to gauge how the new design held up. Participants were asked to perform a few tasks such as finding the "Alarms" option, identifying the current block details, and responding to emergencies. This time, all the participants were able to breeze through these tasks, which led the whole team to finalize this design.
Build and Deploy
After the positive results of the final usability test, the 3rd iteration was finalized and readied for deployment. The final screens are below:


Impact
- Observed a significant 90.41% reduction in bounce rate.
- The new navigation design led to a 15.8% increase in conversion rate during the first few days.
- Observed a 4.3% increase in feature adoption during the first week (for the 5-minute to 15-minute interval feature at the top).
Office Visit
During the office visit, I was introduced to various types of Human Machine Interfaces (HMIs), such as annunciators and industrial panel controls like push buttons, selector switches, panel interfaces, voice interfaces, and alarms.


Fig 1: The iVision max Learning Kit with integrated 7" LCD display HMI


Fig 2: Volt-Amp-Frequency (VAF) Meter with the digital operator

Fig 3: Fast Device Replacement (FDR) operator display


Fig 4: 12- and 6-window annunciators with LED, voice, and alarm features
Learnings
- During the office visit, I was introduced to a variety of Human Machine Interfaces (HMIs). I learned how they are used in real-world scenarios and the design considerations that go into creating them, and I observed how industry workers and staff interacted with them.
- Since most of the designs I made were for B2B digital solutions, they were full of terms such as meters, voltages, and generators, and I had to study these concepts to define a proper hierarchy in the software. I learned a lot about them in the process.
- This internship was highly rewarding: I learned what processes big MNCs follow while taking a product from inception to deployment.