At ArcGIS Server 10.7, administrators can change the jobs directory of a geoprocessing service (or multiple geoprocessing services) from a disk location to a Microsoft Azure cloud storage location. If your geoprocessing services consistently have large outputs, you can use this option to scale your storage resources.
Note:
If the View output in map image layer option was turned on when you published the web tool, you cannot use cloud storage as the jobs directory for the resulting geoprocessing service. Making the change described in this workflow will corrupt the service.
Prepare the Azure environment
You need a Microsoft Azure account to create a storage account and the Blob containers, tables, and queues used below.
Create an Azure storage account
The storage account must meet the following requirements:
- A standard performance storage account is required.
- This account can be a General-purpose v2 (recommended) or General-purpose v1 account. Blob storage, Block Blob storage, and Azure Files storage accounts are not supported.
- The Hot access tier is recommended.
- Other advanced settings of the storage account can be adjusted based on your organization's needs.
Once the storage account is deployed, copy key1 from the storage account's access keys; you'll need it when you register the account as a cloud store with ArcGIS Server.
Create a Blob container and a table
Create a Blob container and a table in the same storage account. The geoprocessing service cannot identify them if they're in different storage accounts.
You also need to create a unique queue in the same storage account for each asynchronous geoprocessing service. Add "jobsStoreQueue":"<name of the queue>" to the serviceProperties for each service.
Note the exact name of the container, table, and the optional queues you create; you'll use them in the following steps.
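Azure rejects names that break its storage naming rules, so it can save a round trip to validate the container, table, and queue names before you create them. The following is a minimal sketch, not part of any Azure or ArcGIS API: the two functions and all example names are hypothetical, and the regexes reflect Azure's published naming constraints (containers and queues: 3–63 lowercase letters, digits, and hyphens, starting and ending with a letter or digit, no consecutive hyphens; tables: 3–63 alphanumeric characters starting with a letter).

```python
import re

def valid_container_or_queue_name(name):
    # 3-63 chars; lowercase letters, digits, hyphens; must start and
    # end with a letter or digit; no consecutive hyphens.
    if not 3 <= len(name) <= 63:
        return False
    if "--" in name:
        return False
    return re.fullmatch(r"[a-z0-9](?:[a-z0-9-]*[a-z0-9])?", name) is not None

def valid_table_name(name):
    # 3-63 alphanumeric chars, starting with a letter.
    return re.fullmatch(r"[A-Za-z][A-Za-z0-9]{2,62}", name) is not None

print(valid_container_or_queue_name("gp-jobs-output"))  # lowercase + hyphens: OK
print(valid_table_name("gpJobsTable"))                  # alphanumeric: OK
print(valid_container_or_queue_name("GP_Jobs"))         # uppercase/underscore: rejected
```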
Move the jobs directory to Azure
Once the Azure Blob container and the table are deployed, register the Blob container in ArcGIS Server and change the service properties accordingly.
1. Sign in to the ArcGIS Server Administrator Directory and browse to Register Item.
2. Provide the connection information for your Azure Blob container and table as JSON. Reference the sample below.
3. Return to the home page of the Administrator Directory, and click Services.
4. Locate the geoprocessing service you want to configure to use the Azure Blob container, click the service name, and click edit.
5. In the JSON representation of the service, add the following key-value pairs with a new unique serviceId, the name of your cloud store, and the queue for that service:
"serviceId": "<a unique service ID>",
"jobQueueStore": "/cloudStores/<name of your cloud store>",
"jobTableStore": "/cloudStores/<name of your cloud store>",
"outputStore": "/cloudStores/<name of your cloud store>",
"jobObjectStore": "/cloudStores/<name of your cloud store>",
"jobsStoreQueue": "<name of the queue>",
Tip:
The name of the cloud store is at the end of its data item URL endpoint in the Administrator Directory.
6. Click Save Edits to confirm. The geoprocessing service restarts automatically, which takes a moment.
7. If you're configuring multiple geoprocessing services to use the Azure Blob container as their jobs directory, repeat steps 4 through 6 for each service.
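When many services need the same change, it can help to prepare each service's edited JSON with a small script before pasting or posting it back through the Administrator Directory. This is a minimal sketch under stated assumptions: add_cloud_jobs_properties and the trimmed example service JSON are hypothetical, not part of the ArcGIS API.

```python
def add_cloud_jobs_properties(service_json, cloud_store_name,
                              service_id, queue_name):
    """Return a copy of a geoprocessing service's JSON with the
    cloud jobs-directory key-value pairs from step 5 merged in."""
    store_path = "/cloudStores/" + cloud_store_name
    updated = dict(service_json)  # shallow copy; original left untouched
    updated.update({
        "serviceId": service_id,
        "jobQueueStore": store_path,
        "jobTableStore": store_path,
        "outputStore": store_path,
        "jobObjectStore": store_path,
        "jobsStoreQueue": queue_name,
    })
    return updated

# Hypothetical service JSON, trimmed to two properties for brevity.
svc = {"serviceName": "myGPService1", "maximumRecords": "1000"}
edited = add_cloud_jobs_properties(svc, "azure", "gp1_service_id", "gp1_queue")
```

Because each asynchronous service needs its own serviceId and queue, only the cloud store path is shared across services.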
JSON example
In this example, replace dataname, myaccountkey, mystorageaccountname, containername, optionalfoldername, and tablename with your own values.
Register Item
{
  "path": "/cloudStores/dataname",
  "type": "cloudStore",
  "provider": "azure",
  "info": {
    "isManaged": false,
    "connectionString": {
      "accountKey": "myaccountkey",
      "accountName": "mystorageaccountname",
      "defaultEndpointsProtocol": "https",
      "accountEndpoint": "core.windows.net",
      "credentialType": "accessKey"
    },
    "objectStore": "containername/optionalfoldername",
    "tableStore": "tablename"
  }
}
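If you prefer to assemble the Register Item payload in a script instead of editing the JSON by hand, the same structure can be built like this. A minimal sketch: make_cloud_store_item and all of the example values passed to it are hypothetical placeholders, not an ArcGIS or Azure API.

```python
import json

def make_cloud_store_item(name, account_name, account_key,
                          container, table, folder=None):
    """Build the Register Item JSON for an Azure cloud store,
    mirroring the sample payload shown above."""
    object_store = container if folder is None else f"{container}/{folder}"
    return {
        "path": "/cloudStores/" + name,
        "type": "cloudStore",
        "provider": "azure",
        "info": {
            "isManaged": False,
            "connectionString": {
                "accountKey": account_key,
                "accountName": account_name,
                "defaultEndpointsProtocol": "https",
                "accountEndpoint": "core.windows.net",
                "credentialType": "accessKey",
            },
            "objectStore": object_store,
            "tableStore": table,
        },
    }

# Hypothetical names; substitute your own storage account artifacts.
item = make_cloud_store_item("gpstore", "mystorageaccountname", "myaccountkey",
                             "gp-jobs", "gpJobsTable", "outputs")
print(json.dumps(item, indent=2))
```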
Change the service properties JSON of your geoprocessing service by adding the key-value pairs required in step 5 above.
Edit GPServer.
{
  "serviceName": "myGPService1",
  <... removed to save space ...>
  "resultMapServer": "false",
  "maximumRecords": "1000",
  "virtualOutputDir": "/rest/directories/arcgisoutput",
  <... below are the new key-value pairs needed ...>
  "serviceId": "this_is_a_unique_serviceid",
  "jobQueueStore": "/cloudStores/azure",
  "jobTableStore": "/cloudStores/azure",
  "outputStore": "/cloudStores/azure",
  "jobObjectStore": "/cloudStores/azure",
  "jobsStoreQueue": "this_is_a_unique_queue_name",
  <... end of new key-value pairs needed ...>
  "portalURL": "https://domain/webadaptor/",
  "toolbox": <... removed to save space ...>
  },
  "portalProperties": <... removed to save space ...>,
  "extensions": <... removed to save space ...>,
  "frameworkProperties": {},
  "datasets": []
}