By Tomi Fadipe | March 5, 2025

AI is Showing Up in Rugged Tablets in New Ways. But This May Be One of the Most Meaningful on the Frontline Right Now.

Automatic touch sensing and glove/water/stylus mode switching may not seem like a big deal to you. But it is for people who must constantly switch between these modes to do their jobs.  

What can AI do for you? 

Right now. 

Without any training or extra “work” needed on your part.

If you’re an emergency medical technician (EMT), utility lineman, forklift operator, delivery driver, or other frontline worker who can’t do much without your rugged tablet, there’s one thing that comes to mind:

AI can reduce button pushes. 

What do I mean? 

If you’ve used a rugged tablet anytime in the last 25 years, you’ll appreciate how frustrating it can be to manually toggle the touchscreen between “bare finger” and “gloved finger” modes every time you put your gloves on or take them off. Or between wet mode and dry mode when you’re in the rain one minute and in your vehicle the next.

If only the tablet were smart enough to sense whatever was touching the screen and automatically toggle between the appropriate modes, right?

Well, some rugged tablets are now able to figure this out on their own thanks to AI and, specifically, a new AI chipset from SigmaSense that works with mobile devices (like these Windows tablets).

That means you can go about your day, using your tablet however you need to, without having to prime the tablet to capture a signature in the rain or submit a handwritten report. You won’t need to switch any touchscreen settings each time you put on gloves or take them off anymore, either. You can simply grab your tablet and get to work without giving the settings a second thought.
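If you’re curious what that autosensing looks like conceptually, here’s a minimal, hypothetical sketch in Python. To be clear, this is not SigmaSense’s or Zebra’s actual implementation (the real classification runs as an AI model on the low-power touch controller itself), and every function name, signal, and threshold below is invented purely for illustration:

```python
# Hypothetical illustration only. All names, signals, and thresholds are
# invented; the real detection runs as an AI model on the touch controller.
from enum import Enum, auto

class TouchMode(Enum):
    BARE_FINGER = auto()
    GLOVED_FINGER = auto()
    WET = auto()
    STYLUS = auto()

def classify_touch(signal_strength: float, contact_area_mm2: float) -> TouchMode:
    """Toy stand-in for the on-chip model: gloves weaken the capacitive
    signal, water spreads it across the panel, a stylus has a tiny tip."""
    if contact_area_mm2 < 2.0:
        return TouchMode.STYLUS
    if contact_area_mm2 > 40.0:
        return TouchMode.WET
    if signal_strength < 0.3:
        return TouchMode.GLOVED_FINGER
    return TouchMode.BARE_FINGER

def handle_touch(signal_strength: float, contact_area_mm2: float,
                 current_mode: TouchMode) -> TouchMode:
    """Switch modes automatically when the detected input type changes,
    so the user never opens a settings menu."""
    detected = classify_touch(signal_strength, contact_area_mm2)
    if detected != current_mode:
        print(f"Auto-switching: {current_mode.name} -> {detected.name}")
    return detected

# A gloved tap, then a stylus tap, with no manual toggling in between:
mode = TouchMode.BARE_FINGER
mode = handle_touch(0.2, 15.0, mode)   # weak signal -> GLOVED_FINGER
mode = handle_touch(0.9, 1.0, mode)    # tiny contact -> STYLUS
```

The point isn’t the thresholds. It’s that the decision happens continuously in the background, every time you touch the screen, which is exactly the kind of always-on sensing that demands very efficient silicon.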

Wondering why it took so long for this to be possible?! 

Well, believe it or not, this seems like a simple enough feature to engineer into decades-old mobile technology, but it was never really up to device engineers. This friction point wasn’t fixable until AI chips capable of low-power processing matured enough to be trusted in these high-use frontline technology tools.

I recently caught up with Larry Stone from Zebra and Kevin Reinis from SigmaSense for the backstory (or birth story) of this new AI capability in rugged tablets. They explained a little more…

  • why it took decades to make touchscreen autosensing possible on mobile devices.

  • why the Zebra ET6x Windows rugged tablets were the first mobile devices in the world to get this AI assist.

  • how the AI-facilitated autosensing/auto-switching works.

  • why this new AI capability may be among the most meaningful at this moment for frontline workers who spend most of their day out in the elements or inside freezers. 

Hear what they shared, then tell me your thoughts. Is this going to make your job (or your team’s job) a little easier?

Topics
Podcast, Tablets, AI, Field Operations, Retail, Transportation and Logistics, Warehouse and Distribution, Software Tools
