Using ADO.NET bulk copy to create data on MSSQL.
This example follows AdventureWorks naming conventions.
For local development, launch a SQL Server container with Docker:
docker run -e "ACCEPT_EULA=Y" -e "MSSQL_SA_PASSWORD=Str0ngP4ssword#2023" --name mssql-powerapps -p 1433:1433 -d mcr.microsoft.com/mssql/server:2022-latest
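Once the container is up, you can verify that SQL Server accepts connections by running sqlcmd inside the container. A quick sketch, assuming the tools path used by the 2022 image (it may differ in other tags):

```shell
# Connectivity check against the local container;
# -C trusts the container's self-signed certificate
docker exec mssql-powerapps /opt/mssql-tools18/bin/sqlcmd \
  -S localhost -U sa -P 'Str0ngP4ssword#2023' -C -Q 'SELECT @@VERSION'
```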
For real performance testing, start a database on Azure.
Make sure you have the latest Bicep release:
az bicep upgrade
Create the resources on Azure:
💡 Edit the security and location parameters according to your needs - this code is for demo purposes only
# Edit the required values
cp config-template.json config.json
# Run the Bicep template
az deployment group create --resource-group powerapps --template-file main.bicep
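Optionally, confirm the deployment finished before moving on. By default the deployment name is taken from the template file name, here "main":

```shell
# Check the provisioning state of the deployment (should report Succeeded)
az deployment group show -g powerapps -n main \
  --query properties.provisioningState -o tsv
```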
Switching the DTU capacity during the benchmark:
# Options with DTU are S0(10), S1(20), S2(50), S3(100)...
az sql db update -g powerapps -s sqlpowerappsbenchmark -n sqldbbenchmark --service-objective S3 --max-size 250GB
# Go back to Basic to save costs
az sql db update -g powerapps -s sqlpowerappsbenchmark -n sqldbbenchmark --service-objective Basic --max-size 2GB
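The two commands above can be combined into a simple sweep that runs the load at each tier and then drops back to Basic. A sketch, reusing the resource names and the dataload project from this guide:

```shell
# Run the data load at each DTU tier in turn
for tier in S0 S1 S2 S3; do
  az sql db update -g powerapps -s sqlpowerappsbenchmark -n sqldbbenchmark \
    --service-objective "$tier" --max-size 250GB
  (cd dataload && dotnet run)
done

# Return to Basic to save costs
az sql db update -g powerapps -s sqlpowerappsbenchmark -n sqldbbenchmark \
  --service-objective Basic --max-size 2GB
```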
ℹ️ Read more about capacity options in the purchasing models documentation.
💡 The DTU model supports columnstore indexing at S3 and above.
The sample schema compatible with this code is available in the schema.sql file.
Use your favorite IDE, such as Azure Data Studio, to create the objects.
After creating the schema, enter the console app directory to set it up.
cd dataload
cp template.env .env
Get the database connection string:
az sql db show-connection-string -s sqlpowerappsbenchmark -n sqldbbenchmark -c ado.net
Add the connection string to the .env file, replacing the username and password.
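The string returned by `az sql db show-connection-string` contains `<username>` and `<password>` placeholders. A sketch of filling them in and appending the result to .env - the login `sqladmin` and the variable name `CONNECTION_STRING` are assumptions, so adjust them to match your server and template.env:

```shell
# Fetch the ADO.NET connection string and fill in the placeholders
# (sqladmin and CONNECTION_STRING are assumed names)
CS=$(az sql db show-connection-string -s sqlpowerappsbenchmark -n sqldbbenchmark -c ado.net -o tsv)
CS=${CS//<username>/sqladmin}
CS=${CS//<password>/Str0ngP4ssword#2023}
echo "CONNECTION_STRING=\"$CS\"" >> .env
```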
Run the application:
dotnet restore
dotnet run
The console app will use bulk copy to create the data.
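As a point of comparison, the same kind of load can be driven from the shell with the bcp utility, which uses the same TDS bulk-copy protocol as ADO.NET. The server, table, login, and file names below are placeholders for illustration, not values taken from this setup:

```shell
# Hypothetical bulk load of a CSV into an AdventureWorks-style table
# -c uses character mode, -t sets the field terminator
bcp Sales.SalesOrderDetail in data.csv \
  -S sqlpowerappsbenchmark.database.windows.net -d sqldbbenchmark \
  -U sqladmin -P 'Str0ngP4ssword#2023' -c -t ','
```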
When finished, delete the resources:
az group delete -n powerapps -y