
SSIS-965

SSIS-965 is a defect that surfaces only at runtime, when the metadata (column names, data types, lengths, nullability) that SSIS builds at design‑time does not match the actual schema that the source delivers at execution. The error message looks like:
```
Error 0xC0202009 at Data Flow Task, OLE DB Source [1]: The data type of column "CustomerID" is unknown.
```

A typical trigger is layout drift in the source file, for example when the delivered file contains an additional column Region at the end. Consequences: the Data Flow fails validation and the package stops before any rows are moved, unless failure handling is configured.

Beyond the work‑arounds in the table below, a more durable fix is to rebuild the Data Flow programmatically at run time from the schema that was actually delivered (a PowerShell helper that writes that schema to JSON is shown later):

```csharp
using System.IO;
using System.Linq;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Newtonsoft.Json.Linq;

class FlowBuilder
{
    static void Main(string[] args)
    {
        string pkgPath = args[0];     // Path to master package
        string schemaFile = args[1];  // JSON schema

        var app = new Application();
        var pkg = app.LoadPackage(pkgPath, null);

        // Locate Data Flow Task (by name)
        var dfTask = (TaskHost)pkg.Executables
            .Cast<Executable>()
            .First(e => ((TaskHost)e).Name == "DF_LoadDynamic");

        var pipeline = (MainPipe)dfTask.InnerObject;

        // Add the Flat File source component
        var source = pipeline.ComponentMetaDataCollection.New();
        source.ComponentClassID = "DTSAdapter.FlatFileSource";

        // Configure source connection (assume connection manager already exists)
        var cm = pkg.Connections["FlatFileConn"];
        source.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(cm);
        source.RuntimeConnectionCollection[0].ConnectionManagerID = cm.ID;

        // Add OLE DB Destination similarly...

        pipeline.ReinitializeMetaData();
        app.SaveToXml(pkgPath, pkg, null);  // persist the rebuilt package
    }
}
```
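Stripped of the SSIS object model, the underlying check is simple. A minimal Python sketch (a hypothetical helper, not part of any SSIS API) that compares a saved design‑time schema against the header a file actually delivers:

```python
import json

def diff_schema(design_time, runtime_header):
    """Return (added, missing) columns between the saved design-time
    schema and the header delivered at run time."""
    expected = [c["ColumnName"] for c in design_time]
    added = [c for c in runtime_header if c not in expected]
    missing = [c for c in expected if c not in runtime_header]
    return added, missing

# Design-time schema, in the same shape the PowerShell helper writes.
design = json.loads(
    '[{"ColumnName": "CustomerID", "DataType": "nvarchar(4000)", "Nullable": true},'
    ' {"ColumnName": "Name", "DataType": "nvarchar(4000)", "Nullable": true}]'
)

# The run-time file now carries an extra trailing column, Region.
added, missing = diff_schema(design, ["CustomerID", "Name", "Region"])
print(added, missing)  # ['Region'] []
```

Running such a comparison before the Data Flow starts lets a package fail fast with a readable message instead of an opaque 0xC0202009.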
| Work‑around | Description | Pros | Cons |
|-------------|-------------|------|------|
| A. Force a Connection Refresh – set RetainSameConnection = False on the Connection Manager and add a dummy Execute SQL Task that runs SELECT 1 before the Data Flow. | Causes the connection manager to be re‑created at runtime, forcing a new schema read. | Simple; no code changes. | Adds an extra task; may still fail if the file is swapped after the dummy task runs. |
| B. Use a Staging Table – Load the file into a wide staging table with a varchar(max) column for each field, then perform a set‑based INSERT…SELECT into the final destination after schema validation. | Decouples the file schema from the Data Flow; you can validate columns via T‑SQL. | Robust; easy to log errors. | Additional I/O; extra storage; slower for very large files. |

The schema JSON can be produced up front by a short PowerShell helper; `$headers` is assumed to hold the parsed header columns of the file at `$FilePath`:

```powershell
$schema = @()
foreach ($col in $headers) {
    $schema += [pscustomobject]@{
        ColumnName = $col.Trim()
        DataType   = 'nvarchar(4000)'   # default, can be refined later
        Nullable   = $true
    }
}

$schema | ConvertTo-Json -Depth 3 | Set-Content -Path "$FilePath.schema.json"
Write-Host "Schema written to $FilePath.schema.json"
```
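The `default, can be refined later` comment above can be made concrete. A hedged Python sketch of one possible refinement pass; the regex heuristics and the sampling approach are assumptions, not anything SSIS or the helper script provides:

```python
import json
import re

def infer_type(values):
    """Pick a narrower SQL type than the blanket nvarchar(4000) default
    when every sampled value fits it (hypothetical heuristic)."""
    if all(re.fullmatch(r"-?\d+", v) for v in values):
        return "int"
    if all(re.fullmatch(r"-?\d+\.\d+", v) for v in values):
        return "decimal(18,4)"
    return "nvarchar(4000)"

def build_schema(headers, sample_rows):
    """Build the same JSON shape the PowerShell helper writes,
    refining DataType from a sample of rows."""
    cols = list(zip(*sample_rows))  # column-wise view of the sample
    return [
        {"ColumnName": h.strip(), "DataType": infer_type(cols[i]), "Nullable": True}
        for i, h in enumerate(headers)
    ]

schema = build_schema(
    ["CustomerID", "Name", "Region"],
    [["1", "Contoso", "EMEA"], ["2", "Fabrikam", "APAC"]],
)
print(json.dumps(schema, indent=2))
```

Sampling only the first few hundred rows keeps the refinement cheap; anything ambiguous falls back to the permissive nvarchar(4000) default.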
