
Move a geoprocessing service's jobs directory to Microsoft Azure storage

At ArcGIS Server 10.7, administrators can change the jobs directory of a geoprocessing service (or multiple geoprocessing services) from a disk location to a Microsoft Azure cloud storage location, leaving other server directories unchanged. If your geoprocessing services consistently have large outputs, you can use this option to scale your storage resources.


If the View output in map image layer option was enabled when you published the web tool, you cannot use cloud storage as the jobs directory for the resulting geoprocessing service. Making the change described in this workflow will corrupt the service.

Prepare the Azure environment

You need a Microsoft Azure account to create a storage account, Blob containers, and tables.

Create an Azure storage account

The storage account must meet the following requirements:

  • A standard performance storage account is required.
  • This account can be a General-purpose v2 (recommended) or General-purpose v1 account. Blob storage, Block Blob storage, and Azure Files storage accounts are not supported.
  • The Hot access tier is recommended.
  • Other advanced settings of the storage account can be adjusted based on your organization's needs.

Once the storage account is deployed, copy key1 from your storage account's access keys; you'll need it when you register the account as a cloud store with ArcGIS Server.

Create a Blob container and a table

Create a Blob container and a table in the same storage account. The geoprocessing service cannot identify them if they're in different storage accounts.

Optionally, you can create a unique queue for each asynchronous geoprocessing service. If you choose to do so, you must add "jobsStoreQueue":"yourqueuename" to the serviceProperties for each service.

Note the exact name of the container, table, and the optional queues you create; you'll use them in the following steps.
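Because the connectionString in the registration JSON (shown later in this article) is itself an escaped JSON string, it's easy to malform by hand. As a sketch, you could assemble the registration payload programmatically; the helper name and all account values below are placeholders, not part of the product:

```python
import json

def build_cloud_store_item(name, account_name, account_key,
                           container, folder=None):
    """Return a registerItem payload for an Azure Blob cloud store."""
    # ArcGIS Server expects connectionString to be a JSON string
    # embedded inside the outer JSON, so serialize it first.
    connection = json.dumps({
        "accountKey": account_key,
        "accountName": account_name,
        "defaultEndpointsProtocol": "https",
        "accountEndpoint": "",
        "credentialType": "accessKey",
    })
    object_store = container if folder is None else f"{container}/{folder}"
    return {
        "path": f"/cloudStores/{name}",
        "type": "cloudStore",
        "provider": "azure",
        "info": {
            "isManaged": False,
            "connectionString": connection,
            "objectStore": object_store,
        },
    }

# Placeholder values -- substitute your own names and key.
item = build_cloud_store_item("dataname", "storageaccountname",
                              "myaccountkey", "containername",
                              "optionalfoldername")
print(json.dumps(item, indent=2))
```

Building the inner string with json.dumps avoids hand-escaping the quotation marks in connectionString.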

Move the jobs directory to Azure

Once the Azure Blob container and the table are deployed, register the Blob container in ArcGIS Server and change the service properties accordingly.

  1. Sign in to the ArcGIS Server Administrator Directory and navigate to Register Item.
  2. Provide the connection information for your Azure Blob container and table in JSON format. See the JSON example below.
  3. Return to the home page of the Administrator Directory, and click Services.
  4. Locate the geoprocessing service you want to configure to use the Azure Blob container, click the service name, and click edit.
  5. In the JSON representation of the service, add the following key-value pair with the name of your cloud store:

    "jobsDirectory":"/cloudStores/<name of your cloud store>"

    The name of the cloud store is at the end of its data item URL endpoint in the Administrator Directory.

  6. Click Save Edits to confirm. The geoprocessing service will automatically restart, which takes a moment.
  7. If you're configuring multiple geoprocessing services to use the Azure Blob container as their jobs directory, repeat steps 4–6 for each service.
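If you're moving many services, the per-service edit in steps 4 through 6 can be scripted. The sketch below only shows how the service's JSON representation is modified; the helper name and the trimmed sample service are hypothetical, and the edited JSON would still be submitted through the Administrator Directory's edit operation as described above:

```python
import json

def set_jobs_directory(service_json, cloud_store_name):
    """Return a copy of a service's JSON with jobsDirectory set
    to the registered cloud store."""
    edited = dict(service_json)
    edited["jobsDirectory"] = f"/cloudStores/{cloud_store_name}"
    return edited

# Example: a trimmed-down geoprocessing service definition.
service = {"serviceName": "myGPService1", "type": "GPServer"}
edited = set_jobs_directory(service, "dataname")
print(json.dumps(edited))
# The edited JSON would then be saved via the service's edit
# operation in the Administrator Directory, once per service.
```

Working on a copy keeps the original JSON intact in case you need to revert a service to its on-disk jobs directory.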

JSON example

In this example, replace dataname, id, myaccountkey, storageaccountname, containername, optionalfoldername, and tablename with your own values.

Register Item

    "path": "/cloudStores/dataname",
    "type": "cloudStore",
    "id": "8808a800-1585-4109-ae2f-7f12f1edac56",
    "provider": "azure",
    "info": {
      "isManaged": false,
      "connectionString": "{\"accountKey\":\"myaccountkey\",\"accountName\":\"storageaccountname\",\"defaultEndpointsProtocol\":\"https\",\"accountEndpoint\":\"\",\"credentialType\":\"accessKey\"}",
      "objectStore": "containername/optionalfoldername",

Then, edit the service properties JSON of your geoprocessing service to add "jobsDirectory": "/cloudStores/dataname".

Edit GPServer.

 "serviceName": "myGPService1",
<... removed to save space ...>
  "resultMapServer": "false",
  "maximumRecords": "1000",
  "virtualOutputDir": "/rest/directories/arcgisoutput",
  "jobTableStore": "/cloudStores/azure",
  "outputStore": "/cloudStores/azure",
  "jobObjectStore": "/cloudStores/azure",
  "jobsDirectory": "/cloudStores/dataname",
  "portalURL": "https://domain/webadaptor/",
  "toolbox":  <... removed to save space ...>
 "portalProperties": < ...removed to save space... >,
 "extensions": < ...removed to save space... >,
 "frameworkProperties": {},
 "datasets": []