AI Inference for VLLM models with F5 BIG-IP & Red Hat OpenShift
Introduction to AI inferencing
AI inferencing is the stage where a pre-trained AI model uses its learned knowledge to make predictions or decisions, or to generate outputs. AI inference requests are se...
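To make the idea of an inference request concrete, here is a minimal sketch of a client call against a vLLM server through its OpenAI-compatible chat completions API. The endpoint URL and model name below are placeholders chosen for illustration, not values taken from this article; in the deployment discussed here they would correspond to whatever route or virtual server BIG-IP and OpenShift expose in front of the vLLM pods.

```python
# Minimal sketch: send one inference request to a vLLM OpenAI-compatible endpoint.
# The URL and model name are assumptions for illustration only.
import requests

VLLM_URL = "http://vllm.example.com/v1/chat/completions"  # hypothetical route / virtual server

payload = {
    "model": "meta-llama/Llama-3.1-8B-Instruct",  # example model name
    "messages": [
        {"role": "user", "content": "Summarize what AI inference is in one sentence."}
    ],
    "max_tokens": 64,
}

# POST the request and print the generated text from the first choice.
response = requests.post(VLLM_URL, json=payload, timeout=30)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because vLLM speaks the OpenAI API format, the same request works whether the client talks to the pod directly or through a load balancer in front of it.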
Updated Jan 27, 2026 (Version 2.0) by Ulises_Alonso
Solutions architect in Business Development with a focus on automation and integration with partner technologies. Prior to this role I was a consultant in Professional Services and an escalation engineer in Technical Support. Outside F5, I worked as a network engineer for mobile and wired network operators. I started my career in academic research. In all these years, no matter what I've been doing, Linux has always been the best tool. Not the one in the picture :-)