I am in the midst of updating data in multiple tables. I currently have a table with one field, "sources", that is just a list of all the tables that include the field "itemid". I also have a table with two fields, "itemid" and "olditemid". In T-SQL, I would like to iterate through the sources and create the update statements on the fly. Here is what I was trying to do, but I get errors on the update statement saying that my variable is not declared. I am not sure this is even close to the correct way I should be doing this. Ideas?
DECLARE @tblName varchar(50)
DECLARE process_cursor CURSOR FOR
SELECT source
FROM tmpTableNames
OPEN process_cursor
FETCH NEXT FROM process_cursor
INTO @tblName
WHILE @@FETCH_STATUS = 0
BEGIN
UPDATE @tblName
SET itemid = r.itemid
FROM @tblName v, itemref r
WHERE r.olditemid = v.itemid
FETCH NEXT FROM process_cursor
INTO @tblName
END
CLOSE process_cursor
DEALLOCATE process_cursor
What you are trying to do is referred to as "dynamic SQL". While you're on the right track, you can't simply put a variable in place of an object name and execute the query. I'll leave the pitfalls of dynamic SQL to someone else. What you're looking for is this:
DECLARE @tblName varchar(50)
DECLARE process_cursor CURSOR FOR
SELECT source
FROM tmpTableNames
OPEN process_cursor
FETCH NEXT FROM process_cursor
INTO @tblName
WHILE @@FETCH_STATUS = 0
BEGIN
DECLARE @sql NVARCHAR(500)
SELECT @sql = 'UPDATE [' + @tblName + '] SET itemid = r.itemid FROM [' + @tblName + '] v, itemref r WHERE r.olditemid = v.itemid'
EXEC sp_executesql @sql
FETCH NEXT FROM process_cursor
INTO @tblName
END
CLOSE process_cursor
DEALLOCATE process_cursor
What this does is build your update query as a string, then pass the SQL contained in that string to the sp_executesql stored procedure (this is the recommended way of executing dynamic SQL, rather than EXEC('foo')).
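To show why sp_executesql is preferred, here is a minimal sketch of passing a value as a real parameter. The object name still has to be concatenated into the string (table names can't be parameterized), but values can go through the parameter list. The table name 'someTable' and the @minId filter below are made up for illustration and aren't part of your original query:
DECLARE @tblName varchar(50)
DECLARE @minId int
DECLARE @sql NVARCHAR(500)
SET @tblName = 'someTable'  -- hypothetical table name, for illustration only
SET @minId = 1000           -- hypothetical filter value, not part of the original query
SELECT @sql = 'UPDATE [' + @tblName + '] SET itemid = r.itemid FROM [' + @tblName + '] v, itemref r WHERE r.olditemid = v.itemid AND v.itemid >= @minId'
EXEC sp_executesql @sql, N'@minId int', @minId = @minId
With EXEC('foo') that value would have to be concatenated into the string as well; sp_executesql keeps it as a parameter, so the same plan can be reused across values.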