Value too large to fit in the column data area of the buffer

Error: Microsoft.SqlServer.Dts.Pipeline.DoesNotFitBufferException: The value is too large to fit in the column data area of the buffer.
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.HandleUserException(Exception e)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.ProcessInput(Int32 inputID, PipelineBuffer buffer)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)

Isn’t it a nice error message? No line number, no variable name, nothing!

I got this message from a DTSX package that imports data from a CSV file into an MS SQL database, with a transformation script in between.

But how can I find out which column's length doesn't fit without recreating all the column metadata?

ATTENTION: THIS IS SOLUTION 1. IT WORKED ONCE; THE WEEK AFTER, SOLUTION 2 SOLVED THE SAME PROBLEM.

Solution 1

Determine maximum field lengths

  1. Open the CSV file in Excel.
  2. Split the text to columns.
  3. Enter the following formula at the end of each column:
    =MAX(LEN(A2:A30000))
    (use your own cell range instead of A2:A30000)
  4. Press Ctrl+Shift+Enter, because it is an array formula.
    Excel will surround the formula with {}.
    (I had never heard of this feature until reading about it today.)
    A small script that does the same check is sketched right after this list.
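This C# console sketch is only a rough alternative: the file name input.csv and the class name are made up, and the plain comma split won't handle quoted fields that contain commas.

using System;
using System.IO;

class MaxFieldLengths
{
    static void Main(string[] args)
    {
        // The path is an assumption; pass your own CSV file as the first argument.
        string path = args.Length > 0 ? args[0] : "input.csv";
        int[] max = new int[0];

        foreach (string line in File.ReadLines(path))
        {
            // Naive split: assumes no quoted fields containing commas.
            // Note: the header row is counted too; skip the first line if that bothers you.
            string[] fields = line.Split(',');
            if (fields.Length > max.Length)
                Array.Resize(ref max, fields.Length);

            for (int i = 0; i < fields.Length; i++)
                max[i] = Math.Max(max[i], fields[i].Length);
        }

        // Compare these numbers with the column widths in the connection manager.
        for (int i = 0; i < max.Length; i++)
            Console.WriteLine("Column {0}: max length {1}", i + 1, max[i]);
    }
}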

Update maximum field size in DTSX package

  1. Open the flat file connection manager (the one your flat file source uses) in Visual Studio.
  2. Select “Advanced” on the left and compare each column's metadata with the Excel results.
    Normally you shouldn't use “Suggest Types…”, because it replaces all column metadata with the exact maximum field lengths it finds, which is not suitable for a file source that changes every day.
  3. Once you have found the offending column, change its field length step by step in each part of your data flow task.

I don’t think this is the most beautiful solution, but it’s the first one that worked for me.

Solution 2

I had the same error message again, but Solution 1 didn't change anything.

The error was thrown by a script component inside a data flow task.

I compared the sizes of the input and output columns of the script component (right click -> Show Advanced Editor…) and found input columns that were wider than their output columns.

Change either the input or the output column width so they match, and it will work.
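For context, this is roughly what the failing assignment looks like inside the script component. The Input0_ProcessInputRow method and the Input0Buffer class are generated by the component designer; the column names (Name, NameOut) and the 50-character output width are made-up examples, not values from my package.

// Inside the designer-generated ScriptMain class of the script component.
// Assumed example: input column Name is DT_STR(100), output column NameOut is DT_STR(50).
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string value = Row.Name;

    // If value is longer than the 50 characters reserved for NameOut in the buffer,
    // this assignment throws DoesNotFitBufferException:
    // Row.NameOut = value;

    // Stopgap: truncate explicitly so the value always fits.
    Row.NameOut = value.Length > 50 ? value.Substring(0, 50) : value;
}

The cleaner fix is still the one described above: make the output column at least as wide as the input column in the advanced editor, so no truncation is needed.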
