Next Level Web Apps: Customer Telemetry

Justin Baird, Director of Application Development, CubeSmart

In web applications today, thousands of telemetry data points are pulled into data warehousing systems and processed to give the company insight into how Customers are using the app. Organizations rely heavily on the offerings of Google, Microsoft, Adobe, and others to provide page-level insights into how a customer travels through an application, and into how optimal that customer's experience was. Page-level metrics such as time-on-page, user flow, and session duration are aggregated and married to geographic and demographic data; they are the general indicators of the effectiveness of the app's design. Collecting this application telemetry data comes at a cost, as every data point generated must be processed and stored, and if this is not done properly, performance is severely impacted. The analytics and monitoring companies have been very successful in balancing the data provided for server and application performance against the impact on the customer experience. As the demand for information becomes more sophisticated, the required granularity of data becomes finer. The next level of granularity for customer data is Customer Telemetry.

"Customer Telemetry consists of recording nearly every customer movement and interaction from within the page"

Customer Telemetry consists of recording nearly every customer movement and interaction from within the page. Gathering information every time a user hovers over an image, clicks into a control, or even spends time in the various Asynchronous JavaScript and XML (AJAX) controls within a page would allow the most granular insight into Customer behavior. The volume of data generated by capturing this level of detail could be upwards of tens of thousands of data points per Customer, dwarfing all other types of telemetry data.

Imagine the ability to recreate a Customer’s interaction with an app down to the mouse movements. This enhanced level of detail allows companies to segment the user base and identify behavioral patterns on demand, instead of analyzing delayed data from a focus group. This agility could enable quick identification and optimization of problematic user experience elements, or enhancement of application flows. Collecting tens of thousands of data points per user, without negatively affecting application performance, is now achievable by leveraging commodity web server hardware and service offerings from many cloud providers.

The data acquisition process involves implementing tracking code on every page element's mouse-in and mouse-out events. By recording the exact timestamp and control (as well as any other pertinent data) on those events, a UX designer or support person would be able to recreate a Customer’s usage pattern on demand. Unless properly implemented, however, generating this volume of data could cripple the performance of the application and would likely do more harm than good. The proposed technology stack is Operating System, platform, provider, and language agnostic, running on commodity hardware. The only specialized component necessary would be the data warehousing system of choice and the necessary supporting equipment.

Page Level

On the page level, each element would have events executed any time a user interacts with that element. These include events such as mouse over, hover, and click, as well as entry into text boxes and combo boxes. Each of these events would spawn a short-lived background thread. By capturing the user data into a background thread, the User Experience (UX) thread is free to respond with minimal interruption.

Background Thread

As each event fires, it spawns a background thread that exists solely to take the telemetry data from the UX thread and record it into the local in-memory cache on the web server. After the data is in the cache, the thread is recycled.
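The event-to-cache handoff above can be sketched as follows. The stack described is language agnostic, so this is only an illustrative Python sketch: the handler names, field names, and the use of a single queue-draining thread are assumptions, not part of the article's design.

```python
import queue
import threading
import time

# Stand-in for the web server's local in-memory cache (illustrative).
telemetry_cache = []
event_queue = queue.Queue()

def on_event(element_id, event_type):
    """Called from the UX thread: enqueue the data point and return
    immediately, so the user-facing path is barely interrupted."""
    event_queue.put_nowait({
        "ts": time.time(),
        "element": element_id,
        "event": event_type,
    })

def background_recorder():
    """Background work: move queued events into the local cache."""
    while True:
        item = event_queue.get()
        if item is None:  # sentinel: stop the recorder
            break
        telemetry_cache.append(item)
        event_queue.task_done()

recorder = threading.Thread(target=background_recorder, daemon=True)
recorder.start()

# Simulated user interactions arriving from the UX thread.
on_event("hero-image", "mouseover")
on_event("signup-button", "click")

event_queue.put(None)
recorder.join()
print(len(telemetry_cache))  # 2
```

The key design point is that the UX-facing call does nothing but enqueue; all recording work happens off the user's critical path.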

Web Server Worker

Each front-end web server would have a worker installed on it that would package up telemetry data from the local in-memory cache and forward it as messages to a Queue. Queue offerings from any major Infrastructure as a Service (IaaS) provider can handle hundreds of thousands of messages at a time.
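Packaging the cache into queue messages might look like the following sketch. The batch size, field names, and JSON encoding are assumptions for illustration; a real worker would tune the batch size to the queue provider's message-size limit.

```python
import json

# Hypothetical local cache contents (field names are illustrative).
telemetry_cache = [
    {"ts": 1700000000.0 + i, "element": f"el-{i}", "event": "mouseover"}
    for i in range(250)
]

BATCH_SIZE = 100  # assumed; tune to the provider's message-size limit

def package_batches(cache, batch_size=BATCH_SIZE):
    """Drain the local cache into queue-ready JSON messages."""
    messages = []
    while cache:
        # Take one batch off the front and shrink the cache in place.
        batch, cache[:] = cache[:batch_size], cache[batch_size:]
        messages.append(json.dumps(batch))
    return messages

messages = package_batches(telemetry_cache)
print(len(messages))         # 3 messages: 100 + 100 + 50 events
print(len(telemetry_cache))  # 0 -- cache fully drained
```

Batching many data points per message is what keeps the per-event overhead low enough for tens of thousands of data points per Customer.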


Worker applications monitor the Queue for new messages and can automatically scale based upon current Queue backlog. Worker applications would take the log data, perform any transforms required, and load the data into the data warehouse.

The primary goal of the workers is to insert information into the data warehouse with the highest possible throughput per second. Any inefficiency in the worker applications will increase costs, as more worker applications are needed to process the queue messages.
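Scaling on queue backlog, as described above, can be expressed as a simple sizing rule. The throughput figure, drain target, and worker limits below are assumed values for illustration only; a real deployment would use its provider's autoscaling hooks and measured throughput.

```python
import math

# Assumed, illustrative numbers -- not from the article.
MESSAGES_PER_WORKER_PER_MIN = 5000   # measured worker throughput
MIN_WORKERS, MAX_WORKERS = 1, 20

def desired_workers(backlog, target_drain_minutes=5):
    """Size the worker pool to drain the current backlog within
    the target window, clamped to the allowed pool size."""
    capacity = MESSAGES_PER_WORKER_PER_MIN * target_drain_minutes
    needed = math.ceil(backlog / capacity)
    return max(MIN_WORKERS, min(MAX_WORKERS, needed))

print(desired_workers(0))          # 1  (keep a minimum pool warm)
print(desired_workers(120_000))    # 5
print(desired_workers(2_000_000))  # 20 (capped at the maximum)
```

Because inefficiency translates directly into more workers, the per-worker throughput number is the cost lever worth measuring first.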

Data Warehouse

The Data Warehousing system would be the ultimate recipient of the telemetry data and would facilitate the reporting based upon that data. Depending on the implementation, it may be wise to place the incoming telemetry data into a holding area optimized for insert speed, and then incorporate it into the warehouse proper.
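The staging-then-incorporate pattern can be sketched with an in-memory SQLite database standing in for the warehouse. The table names and schema are invented for illustration; the point is the two-step flow of a fast bulk insert into a lean staging table, followed by a separate merge into the warehouse proper.

```python
import sqlite3

# In-memory SQLite stands in for the data warehouse (illustrative).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staging_events (ts REAL, element TEXT, event TEXT)")
db.execute("CREATE TABLE fact_events (ts REAL, element TEXT, event TEXT)")

rows = [(1700000000.0 + i, f"el-{i % 10}", "click") for i in range(1000)]

# Fast path: bulk insert into the unindexed staging (holding) table.
with db:
    db.executemany("INSERT INTO staging_events VALUES (?, ?, ?)", rows)

# Later, incorporate staged rows into the warehouse proper and clear staging.
with db:
    db.execute("INSERT INTO fact_events SELECT * FROM staging_events")
    db.execute("DELETE FROM staging_events")

print(db.execute("SELECT COUNT(*) FROM fact_events").fetchone()[0])    # 1000
print(db.execute("SELECT COUNT(*) FROM staging_events").fetchone()[0]) # 0
```

Keeping indexes and constraints off the staging table is what preserves insert throughput; the cost of indexing is paid once during the merge step instead of on every event.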

After Customer Telemetry data points are captured, they are added to existing data from other providers to yield much greater insight into each customer’s behavior. Just as Google Analytics helps identify friction points between pages in the funnel, Customer Telemetry will help identify points within a page that may cause customers to bounce from the site. Telemetry data can also reduce the cost of user experience focus groups, as the actual interactions can serve as a guide to future user experience enhancements.
