Ruby Source

Our Ruby library lets you record analytics data from your Ruby code. The requests hit our servers, and then we route your data to any analytics service you enable on your destinations page.

This library is open-source, so you can check it out on GitHub.

All of our server-side libraries are built for high performance, so you can use them in your web server controller code. This library uses an internal queue to make identify and track calls non-blocking and fast. It also batches messages and flushes asynchronously to our servers.

Want to stay updated on releases? Subscribe to the release feed.

Getting Started

Install the Gem

If you’re using bundler, add the following line to your project’s Gemfile:

gem 'analytics-ruby', '~> 2.0.0', :require => 'segment/analytics'

Or, if you’re using the gem directly from your application, you’ll need to:

gem install analytics-ruby

Then initialize the gem with your Segment source’s Write Key and an optional error handler, like so:

require 'segment/analytics'

Analytics = Segment::Analytics.new({
  write_key: 'YOUR_WRITE_KEY',
  on_error: Proc.new { |status, msg| print msg }
})

That will create an instance of Analytics that you can use to send data to Segment for your source.

If you’re using Rails, you can stick that initialization logic in config/initializers/analytics_ruby.rb and omit the require call.
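For example, a Rails initializer might look like the sketch below. The Rails.logger error handler is just one option, any Proc works:

```ruby
# config/initializers/analytics_ruby.rb
# No `require` needed here; Bundler loads the gem via the Gemfile entry above.
Analytics = Segment::Analytics.new({
  write_key: 'YOUR_WRITE_KEY',
  on_error: Proc.new { |status, msg| Rails.logger.error(msg) }
})
```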

Note: Our Ruby gem makes requests asynchronously, which can sometimes be suboptimal and difficult to debug if you’re pairing it with a queuing system like Sidekiq, Delayed Job, Sucker Punch, or Resque. If you’d prefer a gem that makes requests synchronously, check out simple_segment, an API-compatible drop-in replacement for the standard gem that does its work synchronously inline. Big thanks to Mikhail Topolskiy for his stewardship of this alternative gem!


Identify

The identify method is how you associate your users and their actions with a recognizable userId and traits. You can find details on the identify method payload in our Spec.

The identify call has the following fields:

user_id (String): The ID for this user in your database.
anonymous_id (String, optional): The ID associated with the user when you don’t know who they are. (Either user_id or anonymous_id must be given.)
integrations (Hash, optional): A Hash specifying which destinations this call should be sent to.
traits (Hash, optional): A Hash of traits you know about the user. Things like: email, name or friends.
timestamp (Time, optional): A Time object representing when the identify took place. This is most useful if you’re importing historical data. If the identify just happened, leave it blank and we’ll use the server’s time.
context (Hash, optional): A Hash that can include things like user_agent or ip.

Example identify:

Analytics.identify(
    user_id: '019mr8mf4r',
    traits: { email: "#{ user.email }", friends: 872 },
    context: { ip: '' })

This example call will identify your user by their unique User ID (the one you know him by in your database) and label them with email and friends traits.


Track

The track method lets you record any actions your users perform. You can find details on the track method payload in our Spec.

The track call has the following fields:

user_id (String): The ID for this user in your database.
event (String): The name of the event you’re tracking. We recommend human-readable names like Song Played or Status Updated.
properties (Hash, optional): A Hash of properties for the event. If the event was Product Added to their cart, it might have properties like price or product.
timestamp (Time, optional): A Time representing when the event took place. If the track just happened, leave it out and we’ll use the server’s time. If you’re importing data from the past, make sure you send a timestamp.
context (Hash, optional): A Hash that can include things like user_agent or ip.
anonymous_id (String, optional): The ID associated with the user when you don’t know who they are. (Either user_id or anonymous_id must be given.)
integrations (Hash, optional): A Hash specifying which destinations this call should be sent to.

You’ll want to track events that are indicators of success for your site, like Signed Up, Item Purchased or Article Bookmarked.

To get started, we recommend tracking just a few important events. You can always add more later!

Example track call:

Analytics.track(
    user_id: '019mr8mf4r',
    event: 'Item Purchased',
    properties: { revenue: 39.95, shipping: '2-day' })

This example track call tells us that your user just triggered the Item Purchased event with a revenue of $39.95 and chose your hypothetical ‘2-day’ shipping.

track event properties can be anything you want to record, for example:

Analytics.track(
    user_id: 'f4ca124298',
    event: 'Article Bookmarked',
    properties: {
      title: 'Snow Fall',
      subtitle: 'The Avalanche at Tunnel Creek',
      author: 'John Branch'
    })

For more information about choosing which events to track, event naming, and more, check out the Analytics Academy.


Page

The page method lets you record page views on your website, along with optional extra information about the page being viewed.

If you’re using our client-side setup in combination with the Ruby library, page calls are already tracked for you by default. However, if you want to record your own page views manually and aren’t using our client-side library, read on!

The page call has the following fields:

user_id (String): The ID for this user in your database.
category (String, optional): The category of the page. Useful for things like ecommerce, where many pages might live under a larger category. Note: if you only pass one string to page, we assume it’s a name, not a category. You must include a name if you want to send a category.
name (String, optional): The name of the page, for example Signup or Home.
properties (Hash, optional): A Hash of properties of the page.
anonymous_id (String, optional): If you want to track users anonymously, you can include the Anonymous ID instead of a User ID.
context (Hash, optional): A Hash containing any number of options or context about the request. To see the full reference of supported keys, check out the context reference.

Example page call:

Analytics.page(
  user_id: user_id,
  category: 'Docs',
  name: 'Ruby library',
  properties: { url: '' })

Find details on the page payload in our Spec.


Group

The group method associates an identified user with a company, organization, project, workspace, team, tribe, platoon, assemblage, cluster, troop, gang, party, society or any other name you came up with for the same concept.

This is useful for tools like Intercom, Preact and Totango, as it ties the user to a group of other users.

The group call has the following fields:

user_id (String): The ID for the user that is a part of the group.
group_id (String): The ID of the group.
traits (Hash, optional): A Hash of traits you know about the group. For a company, they might be things like name, address, or phone.
context (Hash, optional): A Hash containing any context about the request. To see the full reference of supported keys, check out the context reference.
timestamp (Time, optional): A Time object representing when the group call took place. If it just happened, leave it out and we’ll use the server’s time. If you’re importing data from the past, make sure you send a timestamp.
anonymous_id (String, optional): An anonymous session ID for this user.
integrations (Hash, optional): A Hash of destinations to enable or disable.

Example group call:

Analytics.group(
  user_id: '019mr8mf4r',
  group_id: '56',
  traits: { name: 'Initech', description: 'Accounting Software' })

Find more details about group, including the group payload, in our Spec.


Alias

alias is how you associate one identity with another. This is an advanced method, but it is required to manage user identities successfully in some of our destinations.

In Mixpanel it’s used to associate an anonymous user with an identified user once they sign up. For KISSmetrics, if your user switches IDs, you can use alias to rename the user_id.

alias method definition:

Analytics.alias(previous_id: 'previous id', user_id: 'new id')

The alias call has the following fields:

user_id (String): The ID for this user in your database.
previous_id (String): The previous ID to alias from.

Here’s a full example of how we might use the alias call:

# the anonymous user does actions ...
Analytics.track(user_id: 'anonymous_user', event: 'Anonymous Event')
# the anonymous user signs up and is aliased
Analytics.alias(previous_id: 'anonymous id', user_id: 'user id')
# the identified user is identified
Analytics.identify(user_id: 'user id', traits: { plan: 'Free' })
# the identified user does actions ...
Analytics.track(user_id: 'user id', event: 'Identified Action')

For more details about alias, including the alias call payload, check out our Spec.

Historical Import

You can import historical data by adding the timestamp argument to any of your method calls. This can be helpful if you’ve just switched to Segment.

Historical imports can only be done into destinations that can accept historical timestamped data. Most analytics tools like Mixpanel, Amplitude, Kissmetrics, etc. can handle that type of data just fine. One common destination that does not accept historical data is Google Analytics, since its API cannot accept historical data.

Note: If you’re tracking things that are happening right now, leave out the timestamp and our servers will timestamp the requests for you.
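For instance, an import of a past purchase might look like the sketch below. The commented-out Analytics.track call assumes the client initialized in Getting Started; the Time handling itself is plain Ruby:

```ruby
require 'time'

# An explicit timestamp marks this event as historical; Time objects
# are serialized to ISO 8601 before being sent to our servers.
imported_at = Time.parse('2015-07-20T10:05:00Z')

# Assuming `Analytics` is the client initialized earlier:
# Analytics.track(
#   user_id: '019mr8mf4r',
#   event: 'Item Purchased',
#   timestamp: imported_at)

puts imported_at.iso8601  # prints 2015-07-20T10:05:00Z
```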

Selecting Destinations

The alias, group, identify, page and track calls can all be passed an object of integrations that lets you turn certain destinations on or off. By default all destinations are enabled.

Here’s an example track call with the integrations object shown.

Analytics.track(
  user_id: '83489',
  event: 'Song Paused',
  integrations: { All: false, KISSmetrics: true })

In this case, we’re specifying that we want this track call to go only to KISSmetrics. All: false says that no destination should be enabled unless otherwise specified, and KISSmetrics: true turns on KISSmetrics.

Destination flags are case sensitive and match the destination’s name in the docs (e.g. “AdLearn Open Platform”, “”, “MailChimp”).


Filtering

  • Available at the business level, filtering track calls can be done right from the Segment UI on your source schema page. We recommend using the UI if possible, since it’s a much simpler way of managing your filters and can be updated with no code changes on your side.

  • If you are on a grandfathered plan, events sent server-side that are filtered through the Segment dashboard will still count towards your API usage.


Performance

Our libraries are built to support high-performance environments. That means it is safe to use analytics-ruby on a web server that’s serving hundreds of requests per second.

Calling a method does not trigger an HTTP request; instead, the message is queued in memory. Messages are flushed in batches in the background, which allows for much faster operation.

By default, our library will flush:

  • the very first time it gets a message
  • whenever messages are queued and there is no outstanding request

There is a maximum of 500KB per batch request and 15KB per call.

The queue consumer makes only a single outbound request at a time to avoid saturating your server’s resources. If multiple messages are in the queue, they are sent together in a batch call.

You can specify the following additional options to determine how the queue operates and to help debug possible errors. None of them are required for normal operation.

# Error handler to log statements
Analytics = Segment::Analytics.new({
  write_key: 'YOUR_WRITE_KEY',
  on_error: Proc.new { |status, msg| print msg },
  max_queue_size: 10000,
  batch_size: 100,
  stub: true
})

on_error (Proc, optional): A handler called whenever errors are returned from the API. Useful for debugging and for first-time destinations.
max_queue_size (Fixnum, optional): The maximum number of messages to put in the queue before refusing to queue more (defaults to 10,000).
batch_size (Fixnum, optional): The maximum number of events/identifies to send in a single batch (defaults to 100). The API servers will not respond to messages over a certain size, so 100 is a safe default.
stub (TrueClass | FalseClass, optional): If true, requests don’t hit the server and are stubbed to be successful (defaults to false).


Flush

If you’re running any sort of script or internal queue system to upload data, you’ll want to call Analytics.flush at the end of execution to ensure that all messages are sent to our servers. We also recommend calling this method on shutdown so that all queued messages are uploaded to Segment.

Analytics.flush

Calling flush will block execution until all messages are processed, so it is not recommended in normal execution of your production application.

If you’re using Ruby on Rails with the Turbolinks setting enabled, and you’re adding Analytics.js on your website, you’ll need to tweak the default configuration.

Instead of having the entire snippet in the <head> of your site, you need to move the call that is included in the snippet by default into the <body> so that it will get triggered on every new page load. But you must have the first part of the snippet in the <head> or the library will fail to load properly.


Serialization

The Ruby library automatically handles serializing your data into JSON for our servers. It uses JSON.generate under the hood. Note that BigDecimal values are intentionally sent as Strings rather than floats so that our Node servers don’t lose precision. If you’d prefer to use a float, you can coerce values to a float before sending the data to Segment.
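As a minimal sketch of that coercion (plain Ruby, no Segment client involved):

```ruby
require 'bigdecimal'
require 'json'

# Left as a BigDecimal, the value would reach Segment as a String;
# coercing to Float yields a JSON number instead.
properties = { revenue: BigDecimal('39.95').to_f }
puts JSON.generate(properties)  # prints {"revenue":39.95}
```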

Multiple Clients

Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of Analytics with different settings:

AppAnalytics = Segment::Analytics.new({
  write_key: 'ONE_WRITE_KEY'
})

MarketingAnalytics = Segment::Analytics.new({
  write_key: 'ANOTHER_WRITE_KEY'
})


Troubleshooting

If you’re having trouble, we have a few tips that help with common problems.

No events in my debugger

  1. Double check that you’ve followed all the steps in the Quickstart.

  2. Make sure that you’re calling one of our API methods once the library is successfully installed—identify, track, etc.

No events in my end tools

  1. Double check your credentials for that destination.

  2. Make sure that the destination you are troubleshooting can accept server-side API calls. Compatibility is shown on the destination docs pages and on the sheets on your Segment source Destinations page.

  3. Check out the destination’s documentation to see if there are other requirements for using the method and destination you’re trying to get working.

If you have any questions or see anywhere we can improve our documentation, please let us know or kick off a conversation in the Segment Community!