Usage Guide for Data Stores

In addition to accessing data stores with the Lua DataStoreService in Studio or on live servers, you can use the Open Cloud APIs for data stores to access them from external scripts and tools with granular access and security control.


You can improve several areas of your workflow by accessing your data with these APIs, such as:

  • Customer Support Portal: Data stores can be useful for storing persistent user resources, such as items in user inventories or skill points. Sometimes you need to update these resources for customer support. Rather than using Studio or joining an experience every time manually, you can build a web application to allow your customer service agents to directly handle customer support requests, such as viewing and modifying user inventories and issuing refunds.

  • LiveOps Dashboard: Data stores can also be useful for live operations, such as scheduling in-experience events. You can build a LiveOps dashboard, pre-code an event in your experience, hide it under a feature flag as part of the configuration data in a data store, and set a time to flip the flag for publishing the event. Your experience servers can detect this change by reading the flag and then launch the event.
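The feature-flag pattern above boils down to reading configuration data from a data store entry and comparing a scheduled time against the current time. The sketch below illustrates the idea with plain JSON; the `summerEvent` entry name and the `enabled`/`startTime` fields are assumptions for illustration, not part of any Roblox API.

```python
import json
from datetime import datetime, timezone

def event_is_live(config_json: str, now: datetime) -> bool:
    """Decide whether the pre-coded event should run, based on
    configuration data read from a data store entry."""
    config = json.loads(config_json)
    flag = config.get("summerEvent", {})
    if not flag.get("enabled", False):
        return False
    # The event goes live once the scheduled start time has passed.
    start = datetime.fromisoformat(flag["startTime"])
    return now >= start

# Example configuration entry as it might be stored in the data store.
entry = '{"summerEvent": {"enabled": true, "startTime": "2024-06-01T00:00:00+00:00"}}'
print(event_is_live(entry, datetime(2024, 7, 1, tzinfo=timezone.utc)))  # True
```

Your experience servers would poll this entry with the Lua API, while the LiveOps dashboard flips `enabled` or adjusts `startTime` through Open Cloud.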

  • Data Migration: Data stores might require changes or updates as your experience evolves, such as upgrading data schemas to accommodate new features. To avoid losing existing user data, you need to migrate your data store from the old schema to a new one. You can write an external script that reads each entry from current data stores, maps the data to the new schema, and writes the entry back to a new data store.
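The read-map-write loop described above hinges on a pure mapping function from the old schema to the new one. A minimal sketch, assuming a hypothetical old schema with a flat `gems` field being migrated to a nested `currency` object (both schemas are illustrative assumptions):

```python
def migrate_entry(old: dict) -> dict:
    """Map one entry value from the old schema to the new one.
    Both schemas here are assumptions for illustration."""
    return {
        "currency": {"gems": old.get("gems", 0)},  # flat field -> nested object
        "weapon": old.get("weapon", ""),
        "level": old.get("level", 1),
    }

# In a real migration script you would List Entries from the current data
# store, apply migrate_entry to each deserialized value, and write the
# result to the new data store under the same entry key.
old_entry = {"gems": 120, "weapon": "sword", "level": 7}
print(migrate_entry(old_entry))
```

Keeping the mapping as a pure function makes it easy to unit-test against sample entries before running the migration against live data.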

Differences with the Lua API

Although Open Cloud APIs are similar to the Lua DataStoreService, there are a few different requirements to be aware of:

  • Universe ID and data store name: The base URL of all Open Cloud data store endpoints is https://apis.roblox.com/datastores/v1/universes/{universeId}. Unlike Lua APIs, Open Cloud APIs are stateless and can be called from anywhere, so you always need to provide the Universe ID, the identifier of your experience that you can copy from the Creator Dashboard, and the data store name when sending requests.

  • Separate permissions for creating and updating: The Lua API creates new entries if they don't exist when you call GlobalDataStore:SetAsync(), but Open Cloud methods for creating and updating entries are separate. Separate permissions can be safer and more flexible in certain situations. For example, you can create a customer support tool that can edit an existing user's profile but can't create a new user's profile.

  • Data serialization: All Open Cloud endpoints require you to serialize all data before network transportation. Serialization converts an object into a string, and deserialization is the inverse operation (converting a string back into an object). The Lua API serializes and deserializes entry content automatically, but for Open Cloud you need to generate and parse your entry data as JSON on your own.
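For example, serializing an inventory entry to JSON before an upload and parsing it back after a download might look like the following. The entry fields follow the example schema used later in this guide, and the checksum step mirrors the base64 Content-MD5 header the endpoints return, here computed locally for illustration:

```python
import base64
import hashlib
import json

# Serialize: convert the entry object to a JSON string for the request body.
entry = {"currency": 100, "weapon": "sword", "level": 5}
body = json.dumps(entry)

# Responses carry a base64-encoded MD5 checksum of the body in the
# Content-MD5 header, which you can recompute locally to verify integrity.
checksum = base64.b64encode(hashlib.md5(body.encode()).digest()).decode()

# Deserialize: parse the JSON string from a response back into an object.
parsed = json.loads(body)
assert parsed == entry
print(body, checksum)
```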

Security Permissions

Data stores usually store sensitive information, such as user profiles and virtual currency. To maintain security, each Open Cloud API has corresponding required permissions that you must add to your API key, such as the List Keys permission for the listing API. If you don't add the required permissions, your API call returns an error. For the specific permissions that are required for each API, see the API Reference.

When configuring your API keys, you can set granular permissions, such as read, write, and list entry, for each data store within a specific experience, or you can give a key to read or write all data stores within an experience. You can also limit access to a subset of data stores needed for your tool instead of exposing all data stores. This mitigates the impact in case your key gets leaked.

Getting the Universe ID

When sending requests, you need to pass the Universe ID, the identifier of the experience in which you want to access your data stores. You can obtain the Universe ID of an experience with the following steps:

  1. Navigate to the Creations page on the Creator Dashboard.
  2. Find the experience with data stores that you want to access.
  3. Click the ... button on the target experience's thumbnail to display a list of options, then select Copy Universe ID from the list.

Building Tools

You can use the language of your choice to build tools with Open Cloud APIs for data stores to fulfill your operation needs. The following example walks through the process of creating a user inventory support portal in Python.

User Inventory Support Portal

This section provides a concrete example of building a user inventory support portal in Python, in which you can list and read a subset of your users' inventories, make edits, and then write the updates back to the experience's data store.

For this example, assume the following:

  • The name of the data store that stores the user inventory is PlayerInventory.

  • The name of the API key is Player-inventory-update.

  • The data schema for each data entry is "userId": {"currency": number, "weapon": string, "level": number}. The key is just userId.

  • The Python script lists a subset of user inventories based on a prefix, increases their virtual currency by 10 for a promotion, and updates the data.

From a high level, you can build your Python app by adding API key permissions and then adding scripts.

Adding API Key Permissions for Data Stores

When creating an API Key for this example, make sure you perform the following operations:

  1. From the Select API System menu in the Access Permissions section, select Data Stores.

  2. (Optional) In the Data Stores section, select API operations for specific data stores.

    1. Enable the Specific Data Store Operations toggle. By default, five data stores automatically load, but you can add additional data stores through the + Add Data Store to List button.
    2. Select the dropdown arrow next to a data store's name, then select the API operations you want the data store to have access to.
  3. Select API operations for the entire experience.

    1. Click the Select Experience to Add dropdown and select an experience.
    2. In the Experience Operations, click the dropdown arrow and select the operations you want to add to your API. This example selects Read Entry, Update Entry, and List Entry Keys for the entire experience.
  4. In the Security section, explicitly restrict IP access to the key using CIDR notation, and set an explicit expiration date so that your key automatically stops working after that date. For this example, since you will do local testing first, you can remove the IP restriction by setting it to 0.0.0.0/0 and let the key expire in 30 days.

Adding Scripts for the User Inventory Support Portal

After creating the API key with the permissions required for the example app, you need to add Python scripts to perform the app's functionality. The tutorialFunctions.py file shows how to define List Entries, Get Entry, and Increment Entry methods. The update_inventory.py file uses the defined methods to list a subset of user inventories, increase the virtual currency of each user, and update the data.

import hashlib
import requests
import json
import base64
import os

class DataStores:
    def __init__(self):
        self._base_url = "https://apis.roblox.com/datastores/v1/universes/"
        # The API key is read from the environment variable 'API_KEY'
        self._apiKey = str(os.environ['API_KEY'])
        self._universeId = "UNIVERSE_ID"
        self.ATTR_HDR = 'Roblox-entry-Attributes'
        self.USER_ID_HDR = 'Roblox-entry-UserIds'
        self._objects_url = self._base_url + self._universeId + '/standard-datastores/datastore/entries/entry'
        self._increment_url = self._objects_url + '/increment'
        self._version_url = self._objects_url + '/versions/version'
        self._list_objects_url = self._base_url + self._universeId + '/standard-datastores/datastore/entries'

    def _H(self):
        return { 'x-api-key' : self._apiKey }

    def get_entry(self, datastore, object_key, scope = None):
        headers = self._H()
        params = {"datastoreName" : datastore, "entryKey" : object_key}
        if scope:
            params["scope"] = scope
        r = requests.get(self._objects_url, headers=headers, params=params)
        if 'Content-MD5' in r.headers:
            expected_checksum = r.headers['Content-MD5']
            checksum = base64.b64encode(hashlib.md5(r.content).digest())
            # print(f'Expected {expected_checksum}, got {checksum}')
        attributes = None
        if self.ATTR_HDR in r.headers:
            attributes = json.loads(r.headers[self.ATTR_HDR])
        user_ids = []
        if self.USER_ID_HDR in r.headers:
            user_ids = json.loads(r.headers[self.USER_ID_HDR])
        return r

    def list_entries(self, datastore, scope = None, prefix="", limit=100, allScopes = False, exclusive_start_key=None):
        headers = self._H()
        params = {"datastoreName" : datastore, "scope" : scope, "allScopes" : allScopes, "prefix" : prefix, "limit" : limit, "cursor" : exclusive_start_key}
        r = requests.get(self._list_objects_url, headers=headers, params=params)
        return r

    def increment_entry(self, datastore, object_key, incrementBy, scope = None, attributes=None, user_ids=None):
        headers = { 'x-api-key' : self._apiKey, 'Content-Type': 'application/octet-stream' }
        params = {"datastoreName" : datastore, "entryKey" : object_key, "incrementBy" : incrementBy}
        if scope:
            params["scope"] = scope
        r = requests.post(self._increment_url, headers=headers, params=params)
        attributes = None
        if self.ATTR_HDR in r.headers:
            attributes = json.loads(r.headers[self.ATTR_HDR])
        user_ids = []
        if self.USER_ID_HDR in r.headers:
            user_ids = json.loads(r.headers[self.USER_ID_HDR])
        return r

import tutorialFunctions

DatastoresApi = tutorialFunctions.DataStores()
# Set up
datastoreName = "PlayerInventory"
# List keys for a subset of users (you might need to use nextPageCursor to view further entries)
keys = DatastoresApi.list_entries(datastoreName)
# Read the inventory of each user
for x in range(5):
    updatedObjectKey = "User_" + str(x + 1)
    value = DatastoresApi.get_entry(datastoreName, updatedObjectKey)
    # Parse the JSON response body
    updatedValue = value.json()
    print(updatedObjectKey + " has " + str(updatedValue) + " gems in their inventory")
# Increase the currency of each user by 10
for x in range(5):
    updatedObjectKey = "User_" + str(x + 1)
    value = DatastoresApi.increment_entry(datastoreName, updatedObjectKey, 10)
    # Parse the JSON response body
    updatedValue = value.json()
    print(updatedObjectKey + " now has " + str(updatedValue) + " gems in their inventory")

To test, set the API_KEY environment variable and run the update_inventory.py file:

export API_KEY=...
python update_inventory.py
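The List Entries response is paginated through nextPageCursor, so listing more than one page of keys means looping until the cursor comes back empty. A minimal sketch of that loop, with a stubbed fetch function standing in for the real HTTP call:

```python
def list_all_keys(fetch_page):
    """Collect entry keys across pages; fetch_page(cursor) stands in for a
    List Entries request and returns the parsed JSON response."""
    keys, cursor = [], ""
    while True:
        page = fetch_page(cursor)
        keys.extend(item["key"] for item in page["keys"])
        cursor = page.get("nextPageCursor", "")
        if not cursor:  # an empty cursor means there are no more pages
            return keys

# Stubbed two-page response for illustration; a real implementation would
# call list_entries(...) and pass the cursor as exclusive_start_key.
pages = {
    "": {"keys": [{"key": "User_1"}, {"key": "User_2"}], "nextPageCursor": "abc"},
    "abc": {"keys": [{"key": "User_3"}], "nextPageCursor": ""},
}
print(list_all_keys(lambda cursor: pages[cursor]))  # ['User_1', 'User_2', 'User_3']
```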

Legacy Scope Support

This section is only relevant if you use the legacy Scope feature.

Like the Lua DataStoreService, every key in a data store has a default global scope, but you can further organize keys by setting a unique string as a scope that specifies a subfolder for the entry. Once you set a scope, it is automatically prepended to all keys in all operations performed on the data store.

The scope categorizes your data with a string and the "/" separator, such as:

Key                Scope
houses/User_1      houses
pets/User_1        pets
inventory/User_1   inventory

All data store entry operation methods have a Scope parameter for when you need to access entries stored under a non-default scope. For example, you might have a key 1234 under the default global scope and the same key under a scope named special. You can access the former without the scope parameter, but to access the latter, you must specify the scope parameter as special in Get Entry or Increment Entry API calls.

Additionally, if you want to enumerate all the keys stored in a data store that has one or more non-default scopes, you can set the AllScopes parameter in the List Entries method to true, in which case the call returns a tuple of key string and scope. In the previous example, List Entries would return both (1234, global) and (1234, special) in the response.

You cannot pass the Scope and AllScopes parameters in the same request, otherwise the call returns an error. Leveraging the helper functions from the Open Cloud APIs for data stores module, the following code illustrates how you can read every key in a data store with a custom scope:

List Keys for Different Scopes

# Set up
import tutorialFunctions
DatastoresApi = tutorialFunctions.DataStores()
datastoreName = "PlayerInventory"
# List keys for the global scope
globalScopeKeys = DatastoresApi.list_entries(datastoreName, scope = "global", allScopes = False)
# List keys for the special scope
specialScopeKeys = DatastoresApi.list_entries(datastoreName, scope = "special", allScopes = False)
# List keys with allScopes set to true
allScopeKeys = DatastoresApi.list_entries(datastoreName, allScopes = True)

Keys with the corresponding scope are returned in the response:

Example Responses for Different Scopes

// Response for the global scope
{ "keys": [{ "scope": "global", "key": "User_2" }], "nextPageCursor": "" }
// Response for the special scope
{ "keys": [{ "scope": "special", "key": "User_2" }], "nextPageCursor": "" }
// Response for allScopes
{ "keys": [{ "scope": "global", "key": "User_2" }, { "scope": "special", "key": "User_2" }], "nextPageCursor": "" }