INTRODUCTION
A data pipeline prepares data sets for analysis by software applications. One of its primary functions is to integrate diverse data sources into a unified system, ensuring that users can access and interact with the integrated data seamlessly.
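As a rough illustration of the integration step described above (a generic sketch, not Honeywell's actual implementation; the field names and sources are assumptions), two data sources can be merged into one unified record set keyed on a shared identifier:

```python
# Hypothetical sketch: unify records from two sources on a shared key.
# Source names and fields are illustrative, not from the case study.

def unify(source_a, source_b, key="id"):
    """Merge records from both sources into one record per key value."""
    unified = {}
    for record in list(source_a) + list(source_b):
        unified.setdefault(record[key], {}).update(record)
    return list(unified.values())

sensors = [{"id": 1, "temp": 21.5}]
metadata = [{"id": 1, "site": "Plant A"}]
print(unify(sensors, metadata))
# → [{'id': 1, 'temp': 21.5, 'site': 'Plant A'}]
```

The same shape of merge is what lets an end application query one unified data set instead of each source separately.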
MY ROLE
Research, analysis, product design, user testing
TEAM
1 designer, 1 product manager,
1 technical architect
TIMELINE
June 2021 – Feb 2022
OVERVIEW OF THE REDESIGNED EXPERIENCE
OVERVIEW OF A CONFIGURED PIPELINE
DETAILS OF APPLIED DATA TRANSFORMATION TYPE
LIST VIEW OF EXISTING PIPELINES
LIST VIEW OF DATA ATTRIBUTES AND DATA PREVIEW
PREVIEW OF FINAL DATA SET
PROBLEM STATEMENT
The code-based data pipeline configuration and setup tool used by Honeywell offers its customers little flexibility. Customers have to reach out to the Honeywell admin team each time they need to make changes to an existing data pipeline. The idea is that users with no coding experience can update a data pipeline to match their new business requirements.
BREAKDOWN OF THE PROBLEM
01 Dependency on the Honeywell admin team for editing an existing data pipeline
02 Complicated workflow for making minor changes in the pipeline increases the time taken to complete a task
03 No error handling mechanism if a configuration fails to apply
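To make the error-handling gap concrete, here is a minimal sketch of upfront configuration validation that surfaces failures to the user instead of letting a pipeline fail silently. The field names and allowed transformation types are hypothetical, not from the Honeywell tool:

```python
# Hypothetical sketch: validate a pipeline configuration before applying
# it and report human-readable errors. Field names are assumptions.

REQUIRED_FIELDS = {"source", "transformation", "destination"}
KNOWN_TRANSFORMATIONS = {"filter", "join", "aggregate"}

def validate_config(config):
    """Return a list of error messages; an empty list means the config is valid."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - config.keys())]
    transformation = config.get("transformation")
    if transformation is not None and transformation not in KNOWN_TRANSFORMATIONS:
        errors.append(f"unknown transformation: {transformation}")
    return errors

print(validate_config({"source": "erp", "transformation": "pivot"}))
# → ['missing field: destination', 'unknown transformation: pivot']
```

Showing errors like these at configuration time is one way a no-code tool can catch problems that previously required the admin team to diagnose.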
THE USERS OF DATA PIPELINE
A configuration implementation specialist, or config implementor, installs, configures, and troubleshoots software products and helps customers understand how the technology functions. They are sent directly by the companies whose software was purchased.
This role is taken up by someone at the customer's end who manages their product configurations as per their business requirements.
OVERVIEW OF THE OLD EXPERIENCE
Code based data pipeline configuration tool
PROCESS
Research
My first step was to understand the concept of a data pipeline and how it could help customers achieve their goals within the context of Honeywell's cloud offerings.
I held multiple working sessions with the PM to determine feature requirements, outline user journeys, and create workflows.
Each business vertical in Honeywell has its own method for handling and preparing datasets for visualization. Standardising the workflow would save engineering effort while offering customers greater flexibility and scalability.
Identifying and prioritising user jobs (experience outcomes)
A comprehensive list of user jobs across personas was framed and stack-ranked to understand users' expectations for achieving a particular outcome through the product. This exercise laid the foundation of the MVP.
A competitive analysis of other enterprise tools was done to understand how they prepare and manage the final dataset consumed by the end application, and to learn industry best practices.
Capturing the detailed journey and workflow of a config implementor helped in understanding the steps involved in setting up a pipeline and the 'feature set' required to successfully complete every step of the configuration.
SOLUTIONS
Wireframe explorations, from basic layouts to screens with detailed content, helped identify the best representation of a data pipeline.