Laboratory automation failures due to misaligned plates, faulty seals, and incorrect consumable placement cost labs thousands of dollars in wasted samples, reagents, and time. What if your automation system could simply... adapt?
We're excited to share a deep dive into our latest FlexPod™ configuration at SLAS2026, featuring advanced Perception capabilities that represent the next evolution in intelligent laboratory automation. This demonstration highlights how vision-guided robots and system cameras can transform routine lab operations into more reliable, flexible, and efficient workflows.
Our SLAS2026 demonstration showcases a configuration that reflects real-world deployments while highlighting advanced quality control capabilities (Fig. 1; Fig. 2):
This reader-feeder setup mirrors the most common customer use cases: plates are pulled from storage, processed through dispensing and reading operations, then returned to incubation – a workflow familiar to many research labs.
Our FlexPod Perception system brings a new level of intelligence to laboratory automation through three key capabilities:
FlexPod's Perception vision system automatically teaches instrument nest positions, dramatically reducing setup time when deploying and redeploying automation across different applications. The vision-enabled robot scans codes to update and verify teach points, ensuring accurate positioning without manual intervention, so you can quickly reconfigure the system with minimal downtime. For selected instruments – in this demonstration, the Plateloc – the vision robot scans the code on every move to confirm accurate picking and placing, further reducing human variability and potential errors.
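The scan-and-verify behavior described above can be illustrated with a short sketch. Everything here is hypothetical: `TeachPoint`, `verify_nest`, and the label/offset inputs are illustrative stand-ins, not part of any FlexPod or Cellario API.

```python
from dataclasses import dataclass

@dataclass
class TeachPoint:
    """A taught nest position with the code label expected at that nest (illustrative)."""
    label: str
    x: float  # mm
    y: float  # mm

def verify_nest(tp: TeachPoint, scanned_label: str, dx: float, dy: float,
                tolerance_mm: float = 1.0) -> tuple[bool, float, float]:
    """Check the scanned code matches the taught nest and correct small offsets.

    dx/dy are the offsets (mm) the camera measured between the code's expected
    and observed positions. Returns (ok, corrected_x, corrected_y).
    """
    if scanned_label != tp.label:
        # Wrong nest or wrong instrument: abort the move rather than mis-place.
        return False, tp.x, tp.y
    if abs(dx) > tolerance_mm or abs(dy) > tolerance_mm:
        # Drift beyond tolerance: flag for re-teaching instead of forcing the pick.
        return False, tp.x, tp.y
    # Small drift: apply the measured correction before picking or placing.
    return True, tp.x + dx, tp.y + dy
```

The key design point is that every move is gated by a fresh observation, so the robot corrects minor drift automatically and refuses moves it cannot verify.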
Combined with FlexPod's cost-effective design, this makes it an ideal entry-level solution for laboratories looking to automate simple workflows or prove the value of automation before committing to larger systems. Use it to demonstrate ROI on target workflows, then scale your automation strategy based on real results.
Hardware-Agnostic Flexibility – The Perception solution isn't locked to specific cameras or sensors. It works with a variety of system or robot cameras, giving users full control and flexibility over their automation solutions.
The Perception demo showcases the automated quality control elements that set this system apart:
Automated Lid Management and Verification – Using LidValet for de-lidding, the system includes inline quality checks for lidding and de-lidding of plates coming from storage. Vision systems verify lid presence and removal in real time and can detect unexpected conditions such as misaligned labware or labware left in destination nests, all without needing to explicitly design those steps into your protocol. This intelligence means fewer errors, fewer stopped protocols, and more productive lab time.
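The inline checks amount to comparing what the camera reports against the expected state of a nest. A minimal sketch, assuming a Perception-style model that returns a set of detected labels; the label names (`lid`, `misaligned_labware`, `labware_in_destination`) and the `inline_qc` helper are hypothetical, not actual product identifiers:

```python
def inline_qc(detected_labels: set[str], lid_expected: bool) -> list[str]:
    """Compare camera detections against the expected nest state.

    Returns a list of QC issues; an empty list means the move may proceed.
    """
    issues = []
    if lid_expected and "lid" not in detected_labels:
        issues.append("lid missing after lidding step")
    if not lid_expected and "lid" in detected_labels:
        issues.append("lid still present after de-lidding step")
    if "misaligned_labware" in detected_labels:
        issues.append("labware misaligned in nest")
    if "labware_in_destination" in detected_labels:
        issues.append("destination nest is not clear")
    return issues
```

Because the check runs on live detections rather than scripted steps, conditions the protocol author never anticipated can still be caught before they waste samples.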
The Perception demo system streamlines workflows by automatically initiating Cellario protocol orders when predefined parameters and conditions are met. These triggering criteria are configured and validated within the Perception model to ensure accuracy and reliability.
To maintain user oversight, the system incorporates built-in safeguards: an audio alert announces when an order is about to execute, providing key details and allowing users a brief window to review or cancel the process if needed. This balance of automation and human verification helps prevent unintended actions while maximizing efficiency.
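The announce-then-wait safeguard can be sketched as a small review-window loop. This is an illustrative pattern only: `execute_with_review`, its callbacks, and the timing values are assumptions for the sketch, not Cellario's actual interface.

```python
import time

def execute_with_review(order_name, run_order, announce, cancel_requested,
                        review_seconds=5.0, poll_interval=0.1):
    """Announce a pending order, then hold it for a short review window.

    run_order:        callable that actually starts the order
    announce:         callable(str) standing in for the audio alert
    cancel_requested: callable() -> bool polled during the review window
    Returns True if the order ran, False if the user cancelled.
    """
    announce(f"Order '{order_name}' will start in {review_seconds:.0f} seconds.")
    deadline = time.monotonic() + review_seconds
    while time.monotonic() < deadline:
        if cancel_requested():
            announce(f"Order '{order_name}' cancelled by user.")
            return False
        time.sleep(poll_interval)
    run_order()
    return True
```

The window is deliberately short: long enough for an operator to intervene, short enough that unattended runs proceed without delay.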
Perception enables truly intelligent automation that dynamically makes decisions based on camera feedback, without requiring user intervention either to begin a protocol or at any point during the workflow.
The demonstration also features a handoff nest for AutoPod integration, showcasing the potential for autonomous mobile robot (AMR) integration into FlexPod workflows. This capability is a building block for creating connected laboratory environments where workflows seamlessly span multiple systems and workstations.
By enabling AutoPod-based plate transport, FlexPod can become part of a larger orchestrated workflow – moving samples between instruments, storage systems, and analysis stations without manual intervention. This connected approach transforms isolated automation islands into an integrated laboratory ecosystem, maximizing efficiency and enabling more complex, multi-step protocols that would be impractical with manual sample transfer.
Beyond the main protocol run, we'll have a separate demonstration fixture showing Perception model capabilities including labware detection and verification, object detection, and model training processes. This gives visitors insight into how Perception learns and adapts to new labware and protocols, with future integration planned for Cellario.
Laboratory automation has traditionally required perfect conditions and rigid protocols. The integration of Perception and Vision-guided robots and system cameras represents a fundamental shift toward systems that can adapt, learn, and recover – more like skilled lab technicians and less like inflexible machines.
Whether you're running high-throughput screening campaigns, managing complex cell culture workflows, or conducting assay development, the FlexPod with advanced Perception capabilities delivers more reliable, flexible, and efficient automation.
We invite you to experience these capabilities firsthand at SLAS2026. See how Perception-enabled automation can transform your laboratory workflows from rigid and fragile to flexible and resilient.
The future of lab automation isn't just about speed; it's about intelligence, adaptability, and reliability. Come see how FlexPod with Perception delivers all three or tell us about your project now!