You can bulk import user data into Auth0 using the Create Import Users Job endpoint. Bulk imports are useful for migrating users from an existing database or service to Auth0.
If you try to use more than one migration method (for example, automatic migration then bulk user import), you may encounter a DUPLICATED_USER error. This error indicates that the user exists in Auth0’s internal user store but not in your tenant. To correct this error, delete the user with the Auth0 Management API Delete a Connection User endpoint and then re-attempt the import.
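As a rough illustration of that cleanup step, the following sketch calls the Delete a Connection User endpoint from Node.js 18+ using the built-in fetch. The domain, token, connection ID, and email address are all placeholders.
import { env } from "process";

// Placeholders: replace with your tenant domain, Management API token,
// database connection ID, and the affected user's email address.
const DOMAIN = "{yourDomain}";
const MGMT_API_ACCESS_TOKEN = "MGMT_API_ACCESS_TOKEN";
const CONNECTION_ID = "CONNECTION_ID";
const EMAIL = "user@example.com";

async function deleteConnectionUser(): Promise<void> {
  const url = `https://${DOMAIN}/api/v2/connections/${CONNECTION_ID}/users?email=${encodeURIComponent(EMAIL)}`;
  const response = await fetch(url, {
    method: "DELETE",
    headers: { authorization: `Bearer ${MGMT_API_ACCESS_TOKEN}` },
  });
  if (!response.ok) {
    throw new Error(`Delete failed with status ${response.status}`);
  }
}

deleteConnectionUser().catch(console.error);
After the user has been removed from the connection, re-run the bulk import job.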

Prerequisites

Before you launch the import users job:
  • Configure a database connection to import the users into and enable it for at least one application.
  • If you are importing passwords, make sure the passwords are hashed using one of the supported algorithms. Users with passwords hashed by unsupported algorithms will need to reset their password when they log in for the first time after the bulk import.
  • If you are importing enrollments, make sure they are a supported type: email, phone, or totp.
  • Get a Management API token for job endpoint requests.
If you are using an export file from an Auth0 tenant, you must convert the exported file from ndjson to JSON. To keep the same user IDs, you must also remove the auth0| prefix from all imported user IDs. The import process automatically adds the auth0| prefix to imported user IDs; if you do not remove the existing prefix before importing, the user IDs are returned as auth0|auth0|...
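As a rough illustration, the following Node.js/TypeScript sketch converts an ndjson export into a JSON array and strips the auth0| prefix. The file names are placeholders, and it assumes each exported record stores its identifier in a user_id field.
import fs from "fs";

// Placeholders: adjust the file names to your export and target paths.
const exportedLines = fs
  .readFileSync("tenant-export.ndjson", "utf8")
  .split("\n")
  .filter((line) => line.trim().length > 0);

const users = exportedLines.map((line) => {
  const user = JSON.parse(line);
  // Strip the auth0| prefix so the import does not produce auth0|auth0|... IDs.
  if (typeof user.user_id === "string") {
    user.user_id = user.user_id.replace(/^auth0\|/, "");
  }
  return user;
});

fs.writeFileSync("USERS_IMPORT_FILE.json", JSON.stringify(users, null, 2));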

Create users JSON file

Create a JSON file with the user data you want to import into Auth0. How you export user data to a JSON file will vary depending on your existing user database. The endpoint expects the file to be sent as a piped read stream rather than as a single payload; in Node.js, for example, use fs.createReadStream instead of fs.readFileSync. To learn more about the JSON file schema and see examples, read Bulk Import Database Schema and Examples.
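For instance, a minimal users file might look like the following. The entries are illustrative: the first contains only an email, the second includes a bcrypt custom_password_hash whose value is truncated to a placeholder. See the schema documentation for the full list of supported attributes.
[
  {
    "email": "jane.doe@example.com",
    "email_verified": false
  },
  {
    "email": "john.doe@example.com",
    "email_verified": true,
    "custom_password_hash": {
      "algorithm": "bcrypt",
      "hash": {
        "value": "$2b$10$..."
      }
    }
  }
]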
The file size limit for a bulk import is 500KB. You will need to start multiple imports if your data exceeds this size.

Request bulk user import

To start a bulk user import job, make a POST request to the Create Import Users Job endpoint. Be sure to replace the MGMT_API_ACCESS_TOKEN, USERS_IMPORT_FILE.json, CONNECTION_ID, and EXTERNAL_ID placeholder values with your Management API Access Token, users JSON file, database connection ID, and external ID, respectively.
curl --request POST \
  --url 'https://{yourDomain}/api/v2/jobs/users-imports' \
  --header 'authorization: Bearer MGMT_API_ACCESS_TOKEN' \
  --form users=@USERS_IMPORT_FILE.json \
  --form connection_id=CONNECTION_ID \
  --form external_id=EXTERNAL_ID
The endpoint accepts the following parameters:
  • users: File in JSON format that contains the users to import.
  • connection_id: ID of the connection to which users will be inserted. You can retrieve the ID using the GET /api/v2/connections endpoint.
  • upsert: Boolean value; false by default. When set to false, pre-existing users that match on email address, user ID, phone, or username will fail. When set to true, pre-existing users that match on email address will be updated, but only with upsertable attributes. For a list of user profile fields that can be upserted during import, see User Profile Structure: User profile attributes. Note: Providing a duplicated user entry in the import file will cause an error; in this case, Auth0 will not perform an insert followed by an update.
  • external_id: Optional user-defined string that can be used to correlate multiple jobs. Returned as part of the job status response.
  • send_completion_email: Boolean value; true by default. When set to true, sends a completion email to all tenant owners when the import job is finished. If you do not want emails sent, you must explicitly set this parameter to false.
If the request is successful, you’ll receive a response similar to the following:
{
  "status": "pending",
  "type": "users_import",
  "created_at": "",
  "id": "job_abc123",
  "connection_id": "CONNECTION_ID",
  "upsert": false,
  "external_id": "EXTERNAL_ID",
  "send_completion_email": true
}
The returned entity represents the import job. When the user import job finishes, and if send_completion_email was set to true, the tenant administrator(s) will receive an email notifying them that the job either failed or succeeded. For example, the email for a failed job might report that Auth0 could not parse the users JSON file.
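If you are calling the endpoint from Node.js rather than curl, a sketch along these lines streams the file as described earlier. It assumes the community form-data and axios packages, and all values are placeholders.
import fs from "fs";
import FormData from "form-data";
import axios from "axios";

async function startImportJob(): Promise<void> {
  const form = new FormData();
  // Pipe the file as a read stream rather than loading it with fs.readFileSync.
  form.append("users", fs.createReadStream("USERS_IMPORT_FILE.json"));
  form.append("connection_id", "CONNECTION_ID");
  form.append("external_id", "EXTERNAL_ID");

  const { data: job } = await axios.post(
    "https://{yourDomain}/api/v2/jobs/users-imports",
    form,
    {
      headers: {
        ...form.getHeaders(),
        authorization: "Bearer MGMT_API_ACCESS_TOKEN",
      },
    }
  );
  console.log(job.id, job.status); // e.g. "job_abc123", "pending"
}

startImportJob().catch(console.error);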

Concurrent import jobs

The Create Import Users Job endpoint has a limit of two concurrent import jobs. Requesting additional jobs while there are two pending returns a 429 Too Many Requests response:
{
  "statusCode": 429,
  "error": "Too Many Requests",
  "message": "There are 2 active import users jobs, please wait until some of them are finished and try again
}

Check job status

To check a job’s status, make a GET request to the Get a Job endpoint. Be sure to replace the MGMT_API_ACCESS_TOKEN and JOB_ID placeholder values with your Management API Access Token and user import job ID.
curl --request GET \
  --url 'https://{yourDomain}/api/v2/jobs/JOB_ID' \
  --header 'authorization: Bearer MGMT_API_ACCESS_TOKEN' \
  --header 'content-type: application/json'
Depending on the status of the user import job, you’ll receive a response similar to one of the following:
Pending
{
  "status": "pending",
  "type": "users_import",
  "created_at": "",
  "id": "job_abc123",
  "connection_id": "CONNECTION_ID",
  "external_id": "EXTERNAL_ID"
}
Completed
If a job is completed, the job status response will include a summary with totals of failed, updated, and inserted records, as well as the overall total.
{
  "status": "completed",
  "type": "users_import",
  "created_at": "",
  "id": "job_abc123",
  "connection_id": "CONNECTION_ID",
  "external_id": "EXTERNAL_ID",
  "summary": {
    "failed": 0,
    "updated": 0,
    "inserted": 1,
    "total": 1
  }
}
Failed
If there is an error in the job, it will return as failed. However, note that invalid user information, such as an invalid email, will not make the entire job fail.
{
  "status": "failed",
  "type": "users_import",
  "created_at": "",
  "id": "job_abc123",
  "connection_id": "CONNECTION_ID",
  "external_id": "EXTERNAL_ID",
}
To learn details for failed entries, see Retrieve failed entries below.
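Because jobs can take a while to run (see Job timeouts below), you will typically poll this endpoint until the status is no longer pending. A minimal sketch, assuming Node.js 18+ with the built-in fetch and placeholder values throughout, might look like:
async function waitForJob(jobId: string): Promise<any> {
  const headers = { authorization: "Bearer MGMT_API_ACCESS_TOKEN" };
  // Poll every 5 seconds until the job leaves the "pending" state.
  for (;;) {
    const response = await fetch(`https://{yourDomain}/api/v2/jobs/${jobId}`, { headers });
    const job = await response.json();
    if (job.status !== "pending") {
      return job; // "completed" or "failed"
    }
    await new Promise((resolve) => setTimeout(resolve, 5000));
  }
}

waitForJob("job_abc123").then((job) => console.log(job.status, job.summary));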

Job timeouts

All user import jobs time out after two (2) hours. If your job does not complete within this time frame, it is marked as failed. Furthermore, all of your job-related data is automatically deleted after 24 hours and cannot be accessed afterward. As such, we strongly recommend storing job results using the storage mechanism of your choice.

Retrieve failed entries

If there were errors in the user import job, you can get the error details by making a GET request to the Get Job Error Details endpoint. Be sure to replace the MGMT_API_ACCESS_TOKEN and JOB_ID placeholder values with your Management API Access Token and user import job ID.
curl --request GET \
  --url 'https://{yourDomain}/api/v2/jobs/JOB_ID/errors' \
  --header 'authorization: Bearer MGMT_API_ACCESS_TOKEN' \
  --header 'content-type: application/json'
If the request is successful, you’ll receive a response similar to the following. Sensitive fields such as hash.value will be redacted in the response.
[
    {
        "user": {
            "email": "test@test.io",
            "user_id": "7af4c65cb0ac6e162f081822422a9dde",
            "custom_password_hash": {
                "algorithm": "ldap",
                "hash": {
                    "value": "*****"
                }
            }
        },
        "errors": [
            {
                "code": "...",
                "message": "...",
                "path": "..."
            }
        ]
    }
]
Each error object will include an error code and a message explaining the error in more detail. The possible error codes are:
  • ANY_OF_MISSING
  • ARRAY_LENGTH_LONG
  • ARRAY_LENGTH_SHORT
  • CONFLICT
  • CONFLICT_EMAIL
  • CONFLICT_USERNAME
  • CONNECTION_NOT_FOUND
  • DUPLICATED_USER
  • ENUM_MISMATCH
  • FORMAT
  • INVALID_TYPE
  • MAX_LENGTH
  • MAXIMUM
  • MFA_FACTORS_FAILED
  • MIN_LENGTH
  • MINIMUM
  • NOT_PASSED
  • OBJECT_REQUIRED
  • PATTERN
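To capture these details before the job data is deleted, a sketch along these lines (Node.js 18+ built-in fetch, placeholder values) could fetch the failed entries and log each error code, message, and path:
async function logJobErrors(jobId: string): Promise<void> {
  const response = await fetch(`https://{yourDomain}/api/v2/jobs/${jobId}/errors`, {
    headers: { authorization: "Bearer MGMT_API_ACCESS_TOKEN" },
  });
  const failedEntries = await response.json();
  for (const entry of failedEntries) {
    // Each failed entry includes the (redacted) user record and one or more error objects.
    for (const err of entry.errors) {
      console.log(`${entry.user.email}: ${err.code} - ${err.message} (at ${err.path})`);
    }
  }
}

logJobErrors("job_abc123").catch(console.error);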

Learn more