Tuesday, March 20, 2012

Another WTF for MS - Row Count Error

I'm trying to get a Row Count from a data flow and I get this error:
[Row Count [724]] Error: The variable "User::COUNT_NEWIMGS" specified by VariableName property is not a valid variable. Need a valid variable name to write to.

I tried declaring COUNT_NEWIMGS at the package level and at the data flow level (Int32 data type).
I tried specifying the variable as "COUNT_NEWIMGS", "User::COUNT_NEWIMGS", and "@[COUNT_NEWIMGS]" - still the same error.

And then, when I'm ready to smash my keyboard (again), I find on another forum that Row Count doesn't even update that variable until the data flow task is complete. WTF!!?!!

TheViewMaster wrote:

I'm trying to get a Row Count from a data flow and I get this error:
[Row Count [724]] Error: The variable "User::COUNT_NEWIMGS" specified by VariableName property is not a valid variable. Need a valid variable name to write to.

I tried declaring COUNT_NEWIMGS at the package level and at the data flow level (Int32 data type).
I tried specifying the variable as "COUNT_NEWIMGS", "User::COUNT_NEWIMGS", and "@[COUNT_NEWIMGS]" - still the same error.

It's case-sensitive. Could that be the problem?

TheViewMaster wrote:

And then, when I'm ready to smash my keyboard (again), I find on another forum that Row Count doesn't even update that variable until the data flow task is complete. WTF!!?!!

Well of course it doesn't. If you want to know how many rows are in a data path you have to wait until all the rows have gone through the data path before you know. How can you possibly have a complaint about that?

-Jamie

|||Please be sure that you don't have a trailing space when you type in the variable name...|||
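
(Once the name resolves, remember the value is only readable after the data flow has finished - e.g. from a downstream Script Task. A minimal sketch, assuming a C# Script Task with User::COUNT_NEWIMGS in its ReadOnlyVariables list; the subcomponent name and message text are made up:)

    public void Main()
    {
        // The Dts.Variables indexer is case-sensitive and must match the
        // qualified name exactly: "User::COUNT_NEWIMGS", no trailing spaces.
        int rowCount = (int)Dts.Variables["User::COUNT_NEWIMGS"].Value;

        bool fireAgain = true;
        Dts.Events.FireInformation(0, "RowCountCheck",
            string.Format("Data flow wrote {0} rows.", rowCount),
            string.Empty, 0, ref fireAgain);

        Dts.TaskResult = (int)ScriptResults.Success;
    }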

Jamie Thomson wrote:

Well of course it doesn't. If you want to know how many rows are in a data path you have to wait until all the rows have gone through the data path before you know. How can you possibly have a complaint about that?

-Jamie

IDK - here's how the data flow was supposed to work: it takes in a list of images to unzip, checks in the DB whether they are required and whether there is a newer version, then uses a Script Component (Xceed) to unzip them. Since you can see the number of records when you run it in BIDS, I wanted to show progress status, e.g. 250 of 2500 images unzipped - 10% complete.
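
(A Script Component in the data flow can at least fire progress events as rows pass through, even though the Row Count variable itself is only written at the end. A rough sketch, assuming a synchronous C# transformation; the 2500 total, the batch size, and the event text are made up:)

    // Fires an information event every 250 rows as they flow past.
    private int processed = 0;
    private const int Total = 2500;   // hypothetical: known image count

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        processed++;
        if (processed % 250 == 0)
        {
            bool fireAgain = true;
            ComponentMetaData.FireInformation(0, "UnzipProgress",
                string.Format("{0} of {1} images processed - {2}% complete",
                    processed, Total, processed * 100 / Total),
                string.Empty, 0, ref fireAgain);
        }
    }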

To my surprise, even when using the zip component in the most optimal way (building an array of image names to unzip), doing it in the data flow was MUCH SLOWER than a Script Task that unzipped images one by one.
Here are approximate timings to unzip 767 images to a network share:
Data Flow, unzip one-by-one: 2:55
Data Flow, array of images: 1:02
Custom Script Task, one-by-one: 0:40
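
(For reference, the one-by-one Script Task looks roughly like this - a sketch that substitutes .NET 4.5's built-in System.IO.Compression.ZipFile for the Xceed component used above; the share paths are made up:)

    // At the top of ScriptMain; reference System.IO.Compression.FileSystem.
    using System.IO;
    using System.IO.Compression;

    public void Main()
    {
        string sourceFolder = @"\\server\share\zips";     // hypothetical
        string targetFolder = @"\\server\share\images";   // hypothetical

        foreach (string zipPath in Directory.GetFiles(sourceFolder, "*.zip"))
        {
            // ExtractToDirectory throws if a target file already exists,
            // so give each archive its own subfolder.
            string dest = Path.Combine(targetFolder,
                Path.GetFileNameWithoutExtension(zipPath));
            ZipFile.ExtractToDirectory(zipPath, dest);
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }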

SO - thanks for the help, guys - but I'm not sold on the data flow anymore, as I have serious doubts about its performance/usability (you can't even do a task like an UPDATE in a reasonable time) - which leads me to use more SQL and application code and rely less on SSIS.

BTW - how many MS employees did it take to create SSIS?|||

Jamie Thomson wrote:

TheViewMaster wrote:

Jamie Thomson wrote:

Well of course it doesn't. If you want to know how many rows are in a data path you have to wait until all the rows have gone through the data path before you know. How can you possibly have a complaint about that?

-Jamie


IDK - here's how the data flow was supposed to work: it takes in a list of images to unzip, checks in the DB whether they are required and whether there is a newer version, then uses a Script Component (Xceed) to unzip them. Since you can see the number of records when you run it in BIDS, I wanted to show progress status, e.g. 250 of 2500 images unzipped - 10% complete.
To my surprise, even when using the zip component in the most optimal way (building an array of image names to unzip), doing it in the data flow was MUCH SLOWER than a Script Task that unzipped images one by one.
Here are approximate timings to unzip 767 images to a network share:
Data Flow, unzip one-by-one: 2:55
Data Flow, array of images: 1:02
Custom Script Task, one-by-one: 0:40
SO - thanks for the help, guys - but I'm not sold on the data flow anymore, as I have serious doubts about its performance/usability (you can't even do a task like an UPDATE in a reasonable time) - which leads me to use more SQL and application code and rely less on SSIS.
BTW - how many MS employees did it take to create SSIS?

Huh? You're trying to unzip files within the data-flow? Why would you try and do that?

The data-flow is for moving data. Unzipping files is all about preparing data and no way should that be done in the pipeline. I'm worried that there's some documentation out there somewhere that leads you to think that you SHOULD be doing this. If there is, please point me to it and we'll go about getting it changed.

Why do you continually try to blame anyone but yourself for your own misunderstanding? I'm not trying to blame you for anything; if you misunderstand something, that's OK - let's try and put it right. But there are right and wrong ways of achieving things, and in this instance you are trying to achieve something in the wrong way. You can't blame Microsoft for that.

And quite frankly, I'm also sick to death of your sarcastic, pithy little put-downs of Microsoft and the SSIS team. If you have gripes then address them in the right way, but veiled insults to people you've never met, from behind the anonymity of a computer screen, aren't going to endear you to me - and that makes me less likely to help you out (which I believe I have done quite a bit). Criticism is welcome if it's constructive and valid. If it's neither of those things, please don't bother.

There's a case in point right here on this thread. I see that you have marked Phil's original answer as correct, which means that this was your mistake. That hardly warrants a title of "Another WTF for MS", does it?

You're right, you can't do updates from the pipeline particularly quickly. This is largely inherent in the very nature of an UPDATE and no ETL tool in the world will treat it any differently. With SSIS you have a perfectly valid and performant workaround - stage it to a transient table and issue a set-based update.
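
A minimal sketch of that staged-update pattern (the table, column, and connection names are made up; in a package this would normally be an OLE DB Destination into the staging table followed by an Execute SQL Task, shown here as equivalent C#):

    using System.Data.SqlClient;

    // The data flow bulk-loads the changed rows into dbo.Staging_Images first;
    // then one set-based UPDATE replaces thousands of row-by-row
    // OLE DB Command executions.
    string connectionString = "Data Source=.;Initial Catalog=Images;" +
                              "Integrated Security=SSPI";   // hypothetical
    string sql = @"
        UPDATE t
        SET    t.Version   = s.Version,
               t.ImageData = s.ImageData
        FROM   dbo.Images AS t
        JOIN   dbo.Staging_Images AS s ON s.ImageId = t.ImageId;
        TRUNCATE TABLE dbo.Staging_Images;";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }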

-Jamie

|||My bad - sorry I got overly frustrated.

Thanks for all the help - appreciate it
