
HPE Adds Generative AI To Aruba Networking Central

Drew Conry-Murray

HPE is adding generative AI capabilities to the AI Search feature in HPE Aruba Networking Central, its cloud-based management platform for Aruba gear. HPE says it trained multiple Large Language Models (LLMs) to serve as the backbone of the AI Search feature.

The goal is to help engineers and admins query the search feature using natural language and get more accurate responses, including document summaries, operational diagnoses, and instructions for completing tasks. HPE is replacing its Natural Language Processing (NLP) function in AI Search with the LLMs.

The LLMs are trained on years’ worth of data gathered by Aruba Networking Central. HPE says it has a data lake stocked with telemetry from nearly 4 million wireless and wired network devices, and more than a billion unique endpoints and clients connected to those devices. The company also trained the LLMs on 20,000 pages of HPE documentation and actual customer queries entered into AI Search.

[Image: HPE generative AI search screen. Source: HPE blog]
“AI Search has been around for two years, and our data lake has captured 3 million Aruba-specific network questions,” said Alan Ni, Sr. Director, Edge Marketing, HPE Aruba Networking, in an interview. “We’ve used that data to train a broad set of models to give us more accurate responses.”

“We’ve seen a material improvement of search by going with LLMs instead of NLP,” said Ni, noting that the LLMs do a better job than NLP of interpreting a user’s intent.

HPE declined to say which models it uses.

My Own Private AIdaho

HPE says its models have been trained internally and are hosted in the HPE GreenLake Cloud Platform. Customer queries are not being sent to public LLMs such as ChatGPT.

HPE claims that running these models internally speeds up response times. It might take seconds to get a response to a query in a public model. HPE says its internal LLMs will return results in less than a second.

HPE developed its models to include pre-processing features and guardrails to ensure more accurate answers and maintain privacy. HPE says one of its models is trained to remove Personally Identifiable Information (PII) and Customer Identifiable Information (CII) from responses. There’s also less risk of the models being trained on copyrighted or proprietary sources outside of HPE’s own.
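To make the redaction idea concrete: HPE says it uses a trained model for this, but a crude rule-based stand-in illustrates the general pattern of scrubbing identifiers from a response before it reaches the user. The patterns and redaction tokens below are invented for illustration and have nothing to do with HPE’s implementation.

```python
import re

# Hypothetical sketch: regex patterns for a few identifier types that
# commonly count as PII/CII in network telemetry (not HPE's method,
# which is a trained model).
PATTERNS = {
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "MAC": re.compile(r"\b(?:[0-9A-Fa-f]{2}:){5}[0-9A-Fa-f]{2}\b"),
    "IPV4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def scrub(text: str) -> str:
    """Replace each pattern match with a labeled redaction token."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

print(scrub("AP 00:1A:2B:3C:4D:5E at 10.0.0.12 registered to admin@example.com"))
```

A production scrubber would need far more than regexes (names, serial numbers, free-text identifiers), which is presumably why HPE trained a dedicated model for the task.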

The company says it has tied role-based controls to AI Search, so an engineer with a restricted role or rights can’t get the system to execute commands outside of those rights (e.g., rebooting an AP) or escalate their privileges.
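The role-gating idea can be sketched in a few lines. Everything here is hypothetical for illustration: the role names, permission sets, and commands are invented, not HPE’s actual model.

```python
# Hypothetical sketch of role-gated command execution: an assistant
# checks the caller's role before running an action it suggested.
ROLE_PERMISSIONS = {
    "viewer": {"show_status", "search_docs"},
    "operator": {"show_status", "search_docs", "reboot_ap"},
}

def execute(role: str, command: str) -> str:
    """Run a command only if the caller's role grants it."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    if command not in allowed:
        return f"denied: role '{role}' may not run '{command}'"
    return f"ok: ran '{command}'"

print(execute("viewer", "reboot_ap"))    # restricted role is refused
print(execute("operator", "reboot_ap"))  # permitted role goes through
```

The key design point is that the permission check sits outside the AI layer, so a cleverly worded query can’t talk the system into an action the user’s role doesn’t allow.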

LLMs For Nothing And Your Queries For Free

HPE says the LLM-powered AI Search feature will not require a separate license in Aruba Networking Central nor come at an additional cost. There are no limits on queries or clicks in either the Foundation or Advanced tiers of the cloud service.

AI Silos?

This January, HPE announced it was acquiring Juniper Networks for $14 billion. This put two popular wireless products under one company’s roof. Juniper’s Mist wireless product has been a leader in incorporating AI and ML into network operations. Mist’s AI capabilities have been a compelling market differentiator.

At the time of the acquisition, Juniper CEO Rami Rahim said the plan was to “gradually and thoughtfully merge the portfolios.” To my mind, that now seems less likely given that HPE is bolstering Aruba’s investment in AI.

And frankly, having separate AI systems for each product makes sense. It wouldn’t help an Aruba customer to use an LLM that had been trained on Mist documentation, and vice versa.

When I asked what this GenAI announcement might mean for the Juniper acquisition, Ni said, “We’re two companies with significant install bases. We at Aruba have been on this track with AIOps and ML. We think we have a unique approach, particularly to how we’ve implemented GenAI and how we’re hosting and training it internally.”

My takeaway from this response is that HPE seems most likely to continue offering separate wireless products for the foreseeable future.

About Drew Conry-Murray: Drew Conry-Murray has been writing about information technology for more than 15 years, with an emphasis on networking, security, and cloud. He's co-host of The Network Break podcast and a Tech Field Day delegate. He loves real tea and virtual donuts, and is delighted that his job lets him talk with so many smart, passionate people. He writes novels in his spare time.
