What do you do to address this:
[OLE DB Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "objectName1" needs to be updated in the external metadata column collection.
A corollary question: what does right-clicking a package in Solution Explorer and clicking "Reload with Upgrade" do?
mruniqueid wrote:
A corollary question: what does right-clicking a package in Solution Explorer and clicking "Reload with Upgrade" do?
I can answer this one - the versioning story for tasks allows ISVs (or Microsoft) to release a completely new version of a task (a new assembly version for managed tasks, or a new CLSID for native tasks). The designer doesn't automatically upgrade the package to use the new task version, as that would probably break package compatibility with systems that have the old task version installed. The Reload with Upgrade command tells the designer to try to upgrade tasks to the new version.
|||mruniqueid wrote:
What do you do to address this:
[OLE DB Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "objectName1" needs to be updated in the external metadata column collection.
This means the metadata of your source column ("objectName1") has changed in the source table, or maybe you added this column after configuring your OLE DB Source component.
You should try double-clicking your OLE DB Source; a pop-up window asking whether you want to automatically synchronize the metadata will appear.
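If you want to see exactly what changed before re-synchronizing, a quick metadata check on the source table can help. A minimal sketch - the schema, table, and column names here are illustrative, so substitute your own:

```sql
-- Illustrative names: replace dbo.MyTable / objectName1 with your own.
-- Compare the result against the external metadata columns shown in the
-- OLE DB Source's Advanced Editor (Input and Output Properties tab).
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME = 'MyTable'
  AND COLUMN_NAME = 'objectName1';
```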
HTH.
|||Thanks - works as advertised.
In the meantime I found something interesting that wasn't immediately obvious. I have a foreach container with an item enumeration. The enumeration collection is a list of views, all with the same "basic" definition (I'll explain "basic" in a minute). The container has a data flow task that opens a view (with the view name defined by a variable from the foreach container) as an OLE DB source and copies the view contents to one OLE DB destination.
I kept getting errors about metadata differences and tracked them to the view definitions. All views have the same column data type definitions, but some of the view columns are literals with differing text lengths, such as 'person' (6 chars), 'location' (8 chars), and so on. To solve the problem, the view definitions had to be modified to cast these literals to the destination column length, such as
CAST('person' AS varchar(50)) AS objectName
All of the views have to have exactly the same data types and lengths in order for the package to work.
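To illustrate the fix described above, here is a minimal before/after sketch (the view, table, and column names are made up for the example):

```sql
-- Before: the literal 'person' is implicitly typed varchar(6), while another
-- view's 'location' literal is varchar(8), so the column metadata differs
-- from view to view and the data flow's bound metadata no longer matches.
CREATE VIEW dbo.vPerson AS
SELECT 'person' AS objectName, t.SomeColumn
FROM dbo.SomeTable AS t;

-- After: casting every literal to the destination length gives all of the
-- views identical column metadata, so one data flow can serve them all.
CREATE VIEW dbo.vPersonFixed AS
SELECT CAST('person' AS varchar(50)) AS objectName, t.SomeColumn
FROM dbo.SomeTable AS t;
```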
Just wanted to pass this along in case anybody else runs into the problem
|||Correct. A data flow is bound to its metadata at design time. The expected metadata cannot change at runtime.
-Jamie
My project has a flat file input and the destination table is using nvarchar(xx). When I drag the Data Conversion tool onto the Data Flow tab it makes the output alias for each field "Copy of original field name". Why is this? How can I make it stop prepending "Copy of"?
Next it makes the width of every field 50.
Naturally, the next step complains because I am attempting to put 50 characters into a two-character state field.
Editing every line is tedious. I have to remove "Copy of" and change the size. So when I get done and have it working, I feel that battle has been won. But today I have the subject error and have not been successful in getting rid of it, because there is no OLE DB connection to double-click to find yet another hidden button.
If I have to delete the Data Conversion tool instance (because it apparently has lost its mind) then I will be forced to do all that editing again.
There needs to be a comprehensive, all-clicks-in-one-place menu for this tool.
There has got to be "some" place where I can change that default 50 to some friendlier number like the average width of the first 100 rows for that column.
Is there no other way to clear the external metadata error?
|||
IanO wrote:
My project has a flat file input and the destination table is using nvarchar(xx).
When I drag the Data Conversion tool onto the Data Flow tab it makes the output alias for each field "Copy of original field name". Why is this? How can I make it stop prepending "Copy of"? Next it makes the width of every field 50.
Naturally, the next step complains because I am attempting to put 50 characters into a two-character state field. Editing every line is tedious. I have to remove "Copy of" and change the size. So when I get done and have it working, I feel that battle has been won. But today I have the subject error and have not been successful in getting rid of it, because there is no OLE DB connection to double-click to find yet another hidden button.
If I have to delete the Data Conversion tool instance (because it apparently has lost its mind) then I will be forced to do all that editing again.
There needs to be a comprehensive, all-clicks-in-one-place menu for this tool.
There has got to be "some" place where I can change that default 50 to some friendlier number like the average width of the first 100 rows for that column.
Is there no other way to clear the external metadata error?
I understand your pain and I have similar complaints...
The Data Conversion transformation adds a 'Copy of...' prefix to each column because it preserves the original column. If you remove the 'Copy of' part of the name, the column name is duplicated within the data flow and SSIS is forced to use a multipart name, so you see things like OLE DB Source.MyColumn and Data Conversion.MyColumn. So I don't recommend editing the names, as it makes things harder when you are mapping them in the destination components.
Please, feel free to post a suggestion in Microsoft Connect site:
http://connect.microsoft.com/SQLServer/Feedback
|||Thanks for your reply.
I had no idea it was beneficial to leave the "Copy of" prefix on the output names.
I went back to Kirk's book. If you have it, see page 479 from which I quote:
"You should avoid using this component if possible. If possible, change the column type at the source."
If the magic all-things menu that I wished for in my posting existed, allowing access to all the features without digging deep to find a context menu or builder button, I might be able to refresh the metadata.
Since I posted, I have changed some of the fields' data types to DT_WSTR.
Then I clicked the connection to the destination to see what types were being passed. My changes were not present in the connection stream. I saved the package, closed it, and reopened it. Still the changes are not propagating to the destination. I am wondering if an entirely new package will make it work?
IanO
|||
IanO wrote:
My project has a flat file input and the destination table is using nvarchar(xx).
When I drag the Data Conversion tool on the Data Flow tab it makes the output alias for each field "Copy of original field name". Why is this? How can I make it stop prepending "Copy of"?
You can't. It's a new column, so it has to call it something. "Copy of XXX" is as arbitrary as anything else.
IanO wrote:
Next it makes the width of every field 50.
Naturally, the next step complains because I am attempting to put 50 characters in to a two character state field.
Naturally it does. So your new column will have to be changed to the data type that you require. Should the development tools simply guess what length to put on there? Of course not, so it puts an arbitrary value of 50 in there. SSIS will never make changes for you - you are always responsible for making changes yourself (this is a good thing, by the way).
IanO wrote:
Editing every line is tedious. I have to remove "Copy of" and change the size.
Hopefully you understand why this is unavoidable. YOU have to tell SSIS what the data type of each column should be - it will not try and guess.
IanO wrote:
So when I get done and have it working, I fee that battle has been won. But today I have the subject error and have not been successful in getting rid of the error because there is no OLE DB connection to double click and find yet another hidden button.
Sorry, I'm a bit confused on this one. What button are you referring to on the OLE DB Connection Manager?
IanO wrote:
If I have to delete the Data Conversion tool instance (because it apparently has lost its mind) then I will be forced to do all that editing again.
There needs to be a comprehesive, contains all clicks menu for this tool.
There has got to be "some" place where I can change that default 50 to some friendlier number like the average width of the first 100 rows for that column.
No, there is no such place. Putting in an average like you suggest is a no-no, because that would constitute SSIS guessing, and as I explained, that is something it won't do; an arbitrary value is better. Besides, even if it COULD take the average of the first 100 rows (which it can't, because at design time no actual rows exist in the pipeline - so where are these 100 rows that you are referring to?), that would create an error for any row that exceeded that average - so that isn't a good idea.
What *might* be a good idea would be to have the option of selecting what the arbitrary value should be. That could be an option you set within BIDS. I do however see problems with this if different developers have this option set differently. For safety's sake I think the behaviour is just right as it is currently.
IanO wrote:
Is there no other way to clear the external metadata error?
I don't think so. The developer is responsible for setting what the external metadata should be.
Hope that helps.
-Jamie
|||
IanO wrote:
Thanks for your reply.
I had no idea it was ,,,beneficial to leave the "Copy of" prefix on the output names,
I think what Rafael means by this is that it's not a good idea to change the name of the new column to be the same as the one it is derived from, as this causes confusion later in the pipeline.
IanO wrote:
I went back to Kirk's book. If you have it, see page 479 from which I quote:
"You should avoid using this component if possible. If possible, change the column type at the source."
I agree with the assertion that you should change it at the source if necessary, although I'm not sure that avoidance is a good strategy. This component has its uses in certain circumstances. I don't have the book to hand so can't read the context in which Kirk stated this.
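For what it's worth, with a database source (not a flat file, as in IanO's case), "change the column type at the source" usually means doing the conversion in the source query itself rather than with a Data Conversion transformation. A rough sketch with made-up table and column names:

```sql
-- Illustrative only: casting in the OLE DB Source query means the columns
-- enter the data flow already at the desired types and lengths, so no
-- downstream Data Conversion (and no 'Copy of' columns) is needed.
SELECT CAST(StateCode AS nvarchar(2)) AS StateCode,
       CAST(PersonName AS nvarchar(50)) AS PersonName
FROM dbo.SourceTable;
```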
IanO wrote:
If there existed the magic all things menu that I wished for in my posting, it would allow access to all the features without digging deep to find a context or builder button, I might be able to refresh the meta data.
Since I posted, I have changed some of the fields data types to DT_WSTR.
Then I clicked the connection to the destination to see what types were being passed. My changes were not present in the connection stream. I saved the package, closed it and reopened. Still the changes are not propagating to the destination. I am wondering If an entire new package will make it work?
IanO
The reason changes don't get propagated will be because you have an asynchronous component such as a SORT or an AGGREGATE. I won't go into the details of asynchronous components (there's plenty of info out there though: http://search.live.com/results.aspx?q=ssis+asynchronous+synchronous&mkt=en-GB&form=QBNO) but you'll have to take my word for it that this is unavoidable because of the very nature of the components that you are using. Any column that is input into an asynchronous component is NOT the same as any column that is output from it - even if they have the same name.
It's annoying, yes, but that's the way it is, I'm afraid. If you want a fuller explanation then let me know and I'll do my best to provide one.
-Jamie