Multi-cloud job processing
Deploys Docker-based jobs on public, private, or on-premises clouds
Runs tasks in unique Docker containers so developers don’t have to manage infrastructure, and supports custom stacks so developers can write language-specific workers
Supports microcontainers so developers can run applications with minimal runtime overhead
Isolates Ruby, Python, PHP, Java, or Node.js code in “IronWorker” to process and schedule parallel jobs, each running in an isolated Docker sandbox
Integrates with “IronMQ” to queue messages and dispatch and process jobs at different times
Stores shared state and passes data in “IronCache,” automating activity while minimizing database load
How Iron.io works
Iron.io is a serverless app platform. It consists of IronWorker (a job processor and container manager), IronMQ (a distributed queue service), and IronCache (a fast data service). IronMQ is the messaging layer that mediates data ingestion. It can send and receive data from many sources via either the provided client libraries or custom code implementations.
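The produce/consume flow through IronMQ can be sketched as below. This is a runnable local stand-in, not Iron.io's API: the stdlib `queue.Queue` substitutes for a hosted IronMQ queue (a real integration would use an IronMQ client library with project credentials), and the function names are illustrative.

```python
# Local stand-in for the IronMQ produce/consume pattern.
# queue.Queue plays the role of a hosted IronMQ queue so the
# flow runs anywhere without credentials or network access.
import json
import queue

segment_queue = queue.Queue()  # stands in for a queue on IronMQ

def post_event(q, event):
    """Producer side: push an event onto the queue as a JSON message."""
    q.put(json.dumps(event))

def reserve_event(q):
    """Consumer side: a worker takes the next message and decodes it."""
    return json.loads(q.get())

# A producer (e.g. an app server) enqueues a job...
post_event(segment_queue, {"type": "track", "event": "Signed Up", "userId": "u1"})

# ...and a worker process dequeues it later, decoupled in time.
msg = reserve_event(segment_queue)
```

Because the queue decouples producers from consumers, the worker can run on a different machine or at a different time than the code that enqueued the job, which is the core of the dispatch-and-process-later model described above.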
Get more out of Iron.io with Segment
When you toggle on Iron.io in Segment, we’ll start sending your account’s data to an IronMQ instance. Iron.io supports all of the Segment methods and will accept data from any of the Segment libraries. This means you can fully instrument your data collection for Iron.io with Segment alone.
When sending data to Iron.io, we’ll auto-fill a queue called “segment”. You can then use Iron.io as a message queue buffer in front of your webhook server, or internal data processing cluster. For example, if you want to analyze your data as part of an ETL process, Iron.io can act as an intermediary buffer.
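A buffered ETL step like the one described above might drain the queue in fixed-size batches before loading. The sketch below assumes a local `queue.Queue` pre-filled with sample events in place of the hosted “segment” queue, and `BATCH_SIZE` is an illustrative parameter, not an Iron.io setting.

```python
# Hypothetical sketch: drain a queue of Segment events in batches
# ahead of an ETL load step. queue.Queue stands in for the hosted
# "segment" queue so the example is self-contained.
import json
import queue

BATCH_SIZE = 3  # illustrative batch size for the ETL load step
buffer_q = queue.Queue()

# Simulate Segment auto-filling the queue with track events.
for i in range(7):
    buffer_q.put(json.dumps({"type": "track", "userId": f"u{i}"}))

batches = []
batch = []
while not buffer_q.empty():
    batch.append(json.loads(buffer_q.get()))
    if len(batch) == BATCH_SIZE:
        batches.append(batch)  # full batch ready for loading
        batch = []
if batch:
    batches.append(batch)  # flush the final partial batch

# batches now holds groups of events ready for an ETL load step.
```

Batching this way keeps the downstream database load bounded: the webhook server or processing cluster pulls work at its own pace while IronMQ absorbs bursts from Segment.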