COPY INTO for Azure SQL Managed Instance (MI)
COPY INTO is the fast, cloud-native way to bulk-load files from Azure Blob Storage/ADLS into a table on
Azure SQL Managed Instance. It supports parallel loads, CSV/Parquet, compressed files, and secure auth
via Managed Identity or SAS.
Prerequisites
- Target: Azure SQL Managed Instance (not Azure SQL Database).
- Files in Azure Blob Storage or ADLS Gen2.
- Table already created; column mapping is positional.
- Auth: the instance's managed identity with “Storage Blob Data Reader” on the container, or a SAS token.
Create Target Table (example)
CREATE TABLE dbo.Sales
(
SaleId int NOT NULL,
SaleDate date NOT NULL,
CustomerId int NOT NULL,
Amount decimal(18,2) NOT NULL
);
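Because column mapping is positional, the file's column order must match the table exactly. As a hedged illustration (the helper name and sample values are invented for this sketch), here is a stdlib-only Python snippet that writes a CSV in the dbo.Sales column order, with a header row for FIRSTROW = 2 and LF line endings to match ROWTERMINATOR = '0x0A':

```python
# Illustrative helper (not part of COPY INTO): build a small CSV whose
# column order matches dbo.Sales (SaleId, SaleDate, CustomerId, Amount),
# since COPY INTO maps file columns to table columns by position.
import csv
import io

def make_sales_csv(rows):
    """Write rows as CSV with a header line (skipped via FIRSTROW = 2)."""
    buf = io.StringIO()
    # lineterminator="\n" produces LF, matching ROWTERMINATOR = '0x0A'
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["SaleId", "SaleDate", "CustomerId", "Amount"])
    for sale_id, sale_date, customer_id, amount in rows:
        writer.writerow([sale_id, sale_date, customer_id, f"{amount:.2f}"])
    return buf.getvalue()

sample = make_sales_csv([
    (1, "2024-01-15", 42, 19.99),
    (2, "2024-01-16", 7, 250.00),
])
print(sample)
```

Upload the resulting file under the imports/ path before running the COPY INTO examples below.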
Option 1 — COPY INTO using Managed Identity (recommended)
Grant your SQL MI’s managed identity Storage Blob Data Reader on the container, then run:
COPY INTO dbo.Sales
FROM 'https://<storageaccount>.blob.core.windows.net/<container>/imports/'
WITH (
FILE_TYPE = 'CSV', -- or 'PARQUET'
CREDENTIAL = (IDENTITY='Managed Identity'),
FIRSTROW = 2, -- skip header
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0A', -- LF; use 0x0D0A for CRLF
MAXERRORS = 10,
ERRORFILE = 'https://<storageaccount>.blob.core.windows.net/<container>/errors/'
);
Option 2 — COPY INTO using a SAS token
Pass a container-scoped SAS token as the credential's SECRET (without the leading '?'). Keep it short-lived and least-privileged.
COPY INTO dbo.Sales
FROM 'https://<storageaccount>.blob.core.windows.net/<container>/imports/'
WITH (
FILE_TYPE = 'CSV',
CREDENTIAL = (IDENTITY='Shared Access Signature', SECRET='sv=2024-..&ss=bfqt&sp=rl..&sig=...'),
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR = '0x0A',
MAXERRORS = 0
);
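Since a leaked SAS token grants direct storage access, it is worth checking that a token really is short-lived and read-only before handing it to a load job. A minimal stdlib-only sketch (the function name and thresholds are assumptions for illustration) that inspects the token's expiry (se=) and permissions (sp=) parameters:

```python
# Illustrative pre-flight check (stdlib only): verify a SAS token expires
# soon and grants only read (r) and list (l) permissions. The parameter
# names se/sp are standard SAS query parameters.
from datetime import datetime, timedelta, timezone
from urllib.parse import parse_qs

def sas_is_short_lived_read_only(sas_token, max_hours=24):
    """Return True if the SAS expires within max_hours and grants only read/list."""
    params = parse_qs(sas_token.lstrip("?"))
    # se= is ISO 8601 with a trailing Z; normalize for fromisoformat()
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    permissions = set(params["sp"][0])
    within_window = expiry <= datetime.now(timezone.utc) + timedelta(hours=max_hours)
    least_privileged = permissions <= {"r", "l"}  # read + list only
    return within_window and least_privileged
```

A token failing this check (write/delete permissions, or a multi-week expiry) is a sign to regenerate it with a narrower scope.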
Parquet Example (types come from the Parquet metadata; no delimiter options needed)
COPY INTO dbo.Sales
FROM 'https://<storageaccount>.blob.core.windows.net/<container>/parquet/'
WITH (
FILE_TYPE = 'PARQUET',
CREDENTIAL = (IDENTITY='Managed Identity')
);
Tips & Gotchas
- Loads all files under the path in parallel; use a subfolder per batch if needed.
- For CSV, set FIRSTROW, FIELDTERMINATOR, and ROWTERMINATOR correctly; UTF-8 is the default encoding.
- Use ERRORFILE to capture rejected rows for troubleshooting.
- If your files are gzipped, COPY INTO can read them directly.
- For repeatable pipelines, pair with ADF/Synapse pipelines or a SQL Agent job on MI.
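Picking the wrong ROWTERMINATOR is a common cause of one-giant-row loads. As a small, hedged sketch (the function name is invented for this example), a stdlib-only check that reads raw file bytes and reports which terminator value to use:

```python
# Illustrative pre-flight check (stdlib only): detect whether a CSV uses
# CRLF or LF line endings, so ROWTERMINATOR can be set to '0x0D0A' or
# '0x0A' to match. Operates on raw bytes to avoid newline translation.
def detect_rowterminator(data: bytes) -> str:
    """Return the COPY INTO ROWTERMINATOR hex code for the given file bytes."""
    return "0x0D0A" if b"\r\n" in data else "0x0A"

print(detect_rowterminator(b"SaleId,SaleDate\r\n1,2024-01-15\r\n"))  # 0x0D0A
```

Run this against a sample file from each batch before finalizing the COPY INTO options.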