
Thursday, March 22, 2012

(Design) Production db used as the data warehouse?

We're designing our first BI suite and we're considering not using a data warehouse at all, but connecting directly to the production database via the data source (DS). I have a nagging feeling that this is inherently not a good idea, but would welcome pros and cons.

Pros
- real-time data: updates are available as fast as we can process
- no need for ETL
- we can write named queries for use in the DSV to satisfy our data requirements (see the sketch after these lists)

Cons
- performance hit to non-BI users whenever BI users run reports, if using ROLAP partitions
- potential table locking during processing, again affecting non-BI users
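
To make the named-query idea concrete, here is a minimal sketch of the kind of denormalized query a DSV named query could contain; every table and column name here is a hypothetical placeholder, not our actual schema:

-- Hypothetical DSV named query: flattens OLTP tables into a fact-like shape
SELECT o.OrderID,
       o.OrderDate,
       c.CustomerName,
       o.Quantity,
       o.Amount
FROM dbo.tblOrders o             -- placeholder production table
JOIN dbo.tblCustomers c          -- placeholder production table
    ON c.CustomerID = o.CustomerID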

My gut feel is that instead we should be keeping a synchronized copy of production as a DW to report off, but I will need stronger cons than those above to convince my boss.

Be gentle please - my first post, and I completed my first SSAS course only yesterday. Thanks.

Is your production db home-made, or a solution purchased from somebody?

Is the data quality 100%, so that you don't need data cleansing and validation?

Are there ever updates/upgrades to the production db structure?

Are you sure that your production db is the only source for your data warehouse, and that there will be no other sources in the coming years?

How large is the volume of your production db?

How large will it be in 3-5 years?

Do you plan to archive some old transaction data from your production db?

|||

Thanks Vlad, some good points here:

Is your production db home-made, or a solution purchased from somebody?

- The production database is also ours.


Is the data quality 100%, so that you don't need data cleansing and validation?

- Yes, and any changes that need to be made through the interfaces would be made in the source (production) database (which would then flow through).


Are there ever updates/upgrades to the production db structure?

Cheers, I hadn't considered this - if we change the DDL of the production database during version upgrades then this will potentially have an effect on the DSVs.


Are you sure that your production db is the only source for your data warehouse, and that there will be no other sources in the coming years?

At this stage yes - although if it's very successful I could see clients wanting to report on other external sources like their upstream ERPs. I'm not sure how having the production database as the data warehouse relates to this, though...

How large is the volume of your production db?

Our largest production databases are about 5GB.


How large will it be in 3-5 years?

Hmm, hard to say but would guess 10GB?


Do you plan to archive some old transaction data from your production db?

Very good point - customers who have been using the software for some time and have a large amount of history may want to use the DW as an archive and remove data from production. The approach I outlined above does not allow this easily (we would have to write a number of date-restricted queries limiting the data).


Considering the lukewarm response I got to this thread, perhaps what is being proposed is not such a big deal/poor option - if I consider the cons above, it sounds like we may be able to implement it in a way that works around them. I must admit I still have a bad gut feeling about it but can't really pin down exactly why...


Has anyone seen cubes based directly off their production database (i.e. not a copy of it) in a real world environment? Would love some more feedback...

|||

Hi,

I have seen many SSAS solutions, and some of them connected directly to the production db, but none of the directly-connected ones was an ERP db.

It depends on your business. Is your production db an ERP-like db, or something else? If it is ERP-like, then you must have a DWH if you want to sleep relaxed :-) otherwise you will get plenty of headaches.

|||It's an OLTP database but not high volume, only dozens of transactions an hour. But we're thinking of only processing nightly, and setting an expectation with the users that this will be the case.

Monday, March 19, 2012

"Server: The operation has been cancelled." When Processing Cube

There are several other posts around this but none really seem to point to a solution. Here is my situation:

1. We've been running in production with SP1 of Standard Edition with no problems since October (4GB, Windows Server Standard, 4 fact tables, 20 million rows in largest fact)

2. I applied SP2 to development last week. While working in dev I saw the "Server: The operation has been cancelled." error for the first time while I was making some dimension changes (I'd never seen this error in SP1). I tried to process the cube several times and it always errored this way. I backed out my changes, processed the cube, and all was well. I re-applied some of my changes and all was still well. I decided to upgrade prod, since I thought the problem at this point was some bad configuration of the dimension.

3. We upgraded two servers in prod with the exact same hardware configuration, but with SP2 now and Standard edition (NOTE: The cube structure was not changed). We've been processing fine since Monday's load until last night. Last night one of the servers failed during the cube processing step with the "Server: The operation has been cancelled." error.

4. I've got the process running again after restarting the Analysis Services process...the only thing I could think to try right now. Hopefully it works...I didn't see any posts with any type of guidance that might help.

Has anyone else seen a problem where what worked in SP1 stopped working in SP2? Are most people running SP2 using the most recent patch rollup? Should I move to that? Any hope this issue was solved in one of the patch rollups?

Thanks.

Sounds like you might have a CommitTimeout set, if you are seeing these operation cancelled errors from the processing task. As far as I can tell the default setting for ForceCommitTimeout was changed in SP2, but I did not think there was one for the CommitTimeout. See this post for more information on these two settings: http://geekswithblogs.net/darrengosbell/archive/2007/04/24/SSAS-Processing-ForceCommitTimeout-and-quotthe-operation-has-been-cancelledquot.aspx
|||

Thanks for the response. I had already tried that with no luck, based on other posts I've seen. I saw one posting where the person said they have to restart the services every night. I just set EVERY timeout in the properties (by choosing advanced) to 0...I just had a Cancel Operation in dev again. I'll try it with all the timeouts set to 0 to see what happens.

Thanks.

|||This problem is getting really annoying. On one server it has never happened, on another it happens about every other day. If I restart the service then the process runs through. Note that in dev restarting the service doesn't always fix it...sometimes it just takes trying to run it multiple times. Something in SP2 broke SSAS processing...I never had a single problem in SP1.|||

I'm surprised no one else is seeing these issues with SP2. I had no issues with cube processing until I applied SP2. I just got a different error. Note that so far in every case if I just restart the service then the next cube load works fine. Is there any other tracing I can turn on to help identify what the issue is?

OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,Analysis Services Processing Task,{F9170A49-2D23-4DA6-962C-CD55A57C191A},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1056964601,0x,Internal error: The operation terminated unsuccessfully.
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,master,{F9625461-CC09-4D2F-A3DE-6B64B9D0E230},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1056964601,0x,Internal error: The operation terminated unsuccessfully.
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,Analysis Services Processing Task,{F9170A49-2D23-4DA6-962C-CD55A57C191A},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1056767999,0x,Memory error: Allocation failure : Not enough storage is available to process this command. .
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,master,{F9625461-CC09-4D2F-A3DE-6B64B9D0E230},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1056767999,0x,Memory error: Allocation failure : Not enough storage is available to process this command. .
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,Analysis Services Processing Task,{F9170A49-2D23-4DA6-962C-CD55A57C191A},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1054932978,0x,Errors in the OLAP storage engine: An error occurred while processing the 'Tracking F' partition of the 'Tracking' measure group for the 'Warehouse' cube from the Cube database.
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,master,{F9625461-CC09-4D2F-A3DE-6B64B9D0E230},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1054932978,0x,Errors in the OLAP storage engine: An error occurred while processing the 'Tracking F' partition of the 'Tracking' measure group for the 'Warehouse' cube from the Cube database.
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,Analysis Services Processing Task,{F9170A49-2D23-4DA6-962C-CD55A57C191A},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1054932986,0x,Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
OnError,DWS569794SQL2,NT AUTHORITY\SYSTEM,master,{F9625461-CC09-4D2F-A3DE-6B64B9D0E230},{1E79FD64-2D1F-48A8-910B-30501E61BF0E},7/26/2007 7:08:48 AM,7/26/2007 7:08:48 AM,-1054932986,0x,Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.

|||

In the errors you have, it seems that this is the root cause:

Memory error: Allocation failure : Not enough storage is available to process this command

In case you have large dimensions, a common solution for out-of-memory errors is to reduce the parallelism for processing dimension attributes. There is a server property, CoordinatorExecutionMode; the default is -4, which means that the server will process 4 x NumberOfProcs dimension attributes in parallel (so if you have many attribute members, processing them in parallel, even for a single dimension, can lead to out of memory). Try setting it to 1.
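
For reference, the property can be changed through the server properties dialog in SSMS (show advanced properties) or directly in msmdsrv.ini; a rough sketch of the relevant fragment, with the element placement assumed from a default SSAS 2005 install:

<ConfigurationSettings>
  <!-- -4 (default) = 4 x number of processors in parallel; 1 = one at a time -->
  <CoordinatorExecutionMode>1</CoordinatorExecutionMode>
</ConfigurationSettings>

Restart the Analysis Services service after editing the file.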

Adrian Dumitrascu

|||I've been having the same issue with two individual servers both running SP2; a third server running SP1 will still build the exact same cube without fail.

On both SP2 servers the cube simply hangs after reporting a couple of attribute-key-not-found errors - even running a trace reveals no actual errors or further progress from this point, and any attempts to stop the processing short of restarting the SSAS service fail.

I've tried changing ForceCommitTimeout and CoordinatorExecutionMode in vain.
|||Getting exactly the same problem. Applied SP2 and the cube starts processing but never completes, with no obvious errors. Desperate for a solution...

|||

Adrian, your response is not the answer. There are problems with SP2 that did not exist in SP1. I do have some updates.

1. I did try setting the flag to ignore the memory errors. I believe that this did make a difference.

2. I'm not completely sure about #1, because I also made a big cube change as well. A long time ago I had built a degenerate dimension, as I needed to quickly implement some functionality. I suspected that this was what was causing the cube processing errors. I built a new dimension and updated my ETL to create a standard dimension to replace the degenerate dimension. Once this was done (crossing my fingers) the cube has built without any issue.

For the other people posting, are you using degenerate dimensions? Have you tried setting the MemoryLimitErrorEnabled to false?

Microsoft readers, please note the common thread though, these issues did not exist in SP1.

|||

I believe this error is due to lack of memory, either from lack of memory on your machine, or (more likely) from the 32-bit platform itself. To the best of my knowledge the 32-bit versions of Analysis Services can only utilize a maximum of 3 GB of memory, which includes page file (virtual memory) usage. In my own experience, these errors occur when I am processing a large cube or dimension and the memory usage (combined RAM and virtual memory) reaches about 2.5 GB. I think setting MemoryLimitErrorEnabled to false helps greatly, as does setting processing operations to run in parallel, but only one process at a time (for some reason this works much better than "sequential"). I think the only permanent solution is to upgrade the server to 64-bit, and that is what we are doing now with our server.

Restarting the service clears out the cache, which makes more application memory available, and likewise would make the process more likely to complete successfully.

Friday, March 16, 2012

"sa" account and domain level rights

Hey everyone,
I apologize for the newbie question, but I'm looking for the correct answer. We have 4 production SQL Servers at this time. When we had originally set them up, the "sa" account belonged to the domain administrators group. Since we have a SQL admin team and a domain admin team, we would like to remove this privilege. Is this something we can and should do? Our SQL Servers use mixed mode authentication and some databases are configured for Windows authentication. I would appreciate any input from the community.

You are mixing some things up:

"When we had originally set them up the "sa" account belonged to the domain administrators group."

As sa is a SQL Server login, it cannot belong to any domain group.


"Our SQL servers use mixed mode authentication and some databases are configured for Windows authentication. "

Authentication takes place at the server level, not the database level; so although you switched on mixed authentication at the server level, you can still have authorized users in the database who are authenticated via SQL Server authentication.

Server --Authentication
Database --Authorization
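
A minimal T-SQL sketch of that split (all names are placeholders): logins are created at the server level (authentication), users inside a database (authorization):

-- Server level: authentication (who may connect)
CREATE LOGIN AppLogin WITH PASSWORD = 'Str0ng!Passw0rd';   -- SQL Server authentication
CREATE LOGIN [CONTOSO\AppUser] FROM WINDOWS;               -- Windows authentication

-- Database level: authorization (what they may do inside one database)
USE SomeDatabase;
CREATE USER AppLogin FOR LOGIN AppLogin;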


HTH, Jens Suessmeyer.

|||Sorry for the confusion. Thanks for the response, Jens. We have a domain account called "sa" which was used to create databases on our SQL Server, so it is the current db owner. We use Windows authentication so that credentials are passed via Active Directory from an end user's network logon to our SQL Server. Also, we use the Active Directory "sa" account to log in to SQL, and it allows our SQL administrators to create Active Directory / SQL accounts to facilitate Windows authentication. My concern is that if we remove domain administration rights from this account, we risk breaking applications that have the "sa" account as the database owner.
|||

OK, let's clarify something.

For the domain user account sa, the following chain is valid:

Domain admin --> (inherits) local administrative rights on the SQL Server machine --> (inherits) the sysadmin server role (because in SQL Server the local admin group, and therefore also the domain admins, are system administrators) --> (inherits) the db_owner role on the database, because system administrators are db owners by default.

So, breaking the chain:

- Revoking domain administrative rights from the user sa won't break the chain if you grant him local admin rights.

- Revoking domain administrative rights from the user sa and not giving him local administrative rights on the server won't break the chain if you put him in the sysadmin role on the server.

- Revoking domain administrative rights from the user sa, not giving him local administrative rights on the server, and not putting him in the sysadmin role does break the chain, though he still has db_owner access to the database for now.
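
To make the middle option concrete, here is a sketch using the SQL Server 2005 system procedure (the domain and account names are placeholders):

-- Put the domain account in the sysadmin server role directly,
-- with neither domain admin nor local admin membership:
EXEC sp_addsrvrolemember @loginame = 'CONTOSO\sa', @rolename = 'sysadmin';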


I hope that clarified it a bit. If you didn't understand my explanation or have a further question, don't hesitate to ask :-)

HTH, Jens Suessmeyer.

|||That's a great explanation Jens. Thank you very much. We are going to schedule an off hours test. First step will be to add the domain "sa" account to the local administrator group on our SQL servers. The second step will be removing the account from the domain administrators group. Then we will test and hope for the best. From what I've read, I did not see anything about the "sa" account requiring domain level rights (only local). So, you re-affirmed my thinking. I really appreciate your input.
|||You don't even need to put him in the local admin group. Local admins are in the sysadmin role by default, but you can also put specific users in that role directly. So you can restrict the user further and avoid granting local admin rights at all.

HTH, Jens Suessmeyer.

Sunday, March 11, 2012

"No description found" when saving a package

Hi everyone,

Primary platform is .NET Framework 2.0 and SQL Server 2005 64-bit.

When I try to save my DTSX package from my client to the production server I get this error:

"No Description found"

This happens with either Windows or SQL authentication.

Has anyone ever faced this problem?

Thanks in advance,

Any idea?

Is it something related to Visio 2003? http://support.microsoft.com/kb/922546

Thanks again,

Monday, February 13, 2012

'sa' User Map error

Hello all,

I'm having a problem with my sa account. We just migrated from SQL Server 2000 to SQL Server 2005, and one of our production applications is failing on the sa login.

I checked the 'User Map' setup to the 'sa' account and the databases that need accessing aren't mapped.

When I try to add the mapping, I get the error "Cannot use the special principal 'sa'. (Microsoft SQL Server, Error: 15405)". I get the same error when trying to remove a user mapping for sa (there are a couple of dbs already mapped), and I don't remember how I mapped them in the first place, but it certainly isn't working now!

Thanks in advance,
102020

What is the error that you get back, and for what operation do you get it? Please also check the server error log and copy the relevant errors logged there.

Thanks
Laurentiu

|||I have fixed this already, thanks though.

The problem was that the db owner was the domain admin and not sa, so I couldn't change the mapping.

Thanks,
102020
|||

Hi,

I have the same problem. I am trying to add sa to several databases created by users other than sa. We are getting error 15405: cannot use the reserved user or role name 'SA'. Microsoft SQL-DMO (ODBC SQL State: 42000)

Is there no solution to this problem? We are using SQL Server 2005

Regards Caroline|||

I found the solution!! Quite simple: exec sp_changedbowner 'sa', 'true'.
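
Spelled out a little (the database name is a placeholder; run it in each affected database):

USE YourDatabase;  -- placeholder: the db whose owner you want to change
EXEC sp_changedbowner @loginame = 'sa', @map = 'true';  -- 'true' also remaps the old owner's aliases

After that, the sa login can be managed normally under Logins / User Mapping.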

Thanks anyway

|||

Hey,

I found out how to change it using Management Studio Express:

Right-click the database to show its properties, then change the owner (under Files) to sa;

then you can map it under Logins.

|||Caroline, thanks for posting this - it really helped me out!
|||Thanks Caroline, I've been searching for hours on how to do this; I kept getting the same errors as the above posts.
|||I've been living without database diagrams for one of my databases for the best part of a year since inheriting it from a previous developer. Now I can finally see the relationships between the tables. Thanks CarolineE!!!


%Disk Time value over 100

Hello!
I have collected counters using perfmon from our production SQL Server. I have noticed that the %Disk Time counter is averaging 111, with a maximum value of 2249, for a particular drive. I was wondering how to detect whether this value indicates a bottleneck. The drive in question is RAID 10. Is there a conversion formula I should use?
I know that a value over 60 indicates a potential problem. Average Disk Queue Length is 1, which suggests to me that the disk is OK. I am confused by the high %Disk Time value.
Any advice is appreciated,
Igor
Use 100 - %Idle Time to get your utilization.
"imarchenko" wrote:

> Hello!
> I have collected counters using perfmon from our Production SQL Server. I
> have noticed that %Disk Time counter averaging 111 with maximum value being
> 2249 for particular drive. I was wondering how I detect whether this value
> indicates a bottleneck. Drive in question is RAID 10. Is there conversion
> formula I should use?
> I know that value over 60 indicates a potential problem. Average Disk Queue
> lenght is 1 which means to me that disk is OK. I am confused by high %Disk
> Time value.
>
> Any advice is appreciated,
> Igor
>
>
|||%Disk Time is a useless counter on high-end IO systems. It was designed for sequential-command IO systems, not current SCSI systems that support Command Tag Queuing or the new SATA Native Command Queuing options. These are the device-level commands that support scatter-gather IO. The short description is that they full-duplex the command and response for IO. The device can queue a large number of IO requests, sort them optimally, and respond to them asynchronously and out of order. RAID subsystems further reduced the effectiveness of this counter by abstracting a large number of physical devices into one logical device presented to the OS.
I use Physical Disk | Disk Read Bytes/sec, Disk Write Bytes/sec, Disk Reads/sec, Disk Writes/sec, and Disk Queue Length to determine if I am IO bound. Of course, you need to know the maximum capabilities of your IO subsystem to use these numbers effectively. But since you benchmarked the IO system with IOMeter when you built the server, that is an easy comparison.
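
If you want to capture those counters from the command line, a rough typeperf invocation looks like this (the _Total instance and the sample interval/count are only examples; point it at the disk instance you actually care about):

typeperf "\PhysicalDisk(_Total)\Disk Read Bytes/sec" ^
         "\PhysicalDisk(_Total)\Disk Write Bytes/sec" ^
         "\PhysicalDisk(_Total)\Disk Reads/sec" ^
         "\PhysicalDisk(_Total)\Disk Writes/sec" ^
         "\PhysicalDisk(_Total)\Current Disk Queue Length" ^
         -si 15 -sc 240 -o disklog.csv

That samples every 15 seconds, 240 times (one hour), and writes a CSV you can compare against an IOMeter baseline.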

Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"imarchenko" <igormarchenko@.hotmail.com> wrote in message
news:eKaIHNaqFHA.4044@.TK2MSFTNGP09.phx.gbl...
> Hello!
> I have collected counters using perfmon from our Production SQL Server.
> I have noticed that %Disk Time counter averaging 111 with maximum value
> being 2249 for particular drive. I was wondering how I detect whether this
> value indicates a bottleneck. Drive in question is RAID 10. Is there
> conversion formula I should use?
> I know that value over 60 indicates a potential problem. Average Disk
> Queue lenght is 1 which means to me that disk is OK. I am confused by high
> %Disk Time value.
>
> Any advice is appreciated,
> Igor
>
|||Jeffrey,
Thanks a lot!
Igor
"Jeffrey K. Ericson" <JeffreyKEricson@.discussions.microsoft.com> wrote in
message news:22B02C35-4D78-47A1-B273-88ED2F5EDACA@.microsoft.com...
> Used 100 - %disk idle time to get your utilization.
> "imarchenko" wrote:
|||Thanks, Geoff. I really appreciate your elaborate reply.
Igor
"Geoff N. Hiten" <sqlcraftsman@.gmail.com> wrote in message
news:eSP65saqFHA.272@.TK2MSFTNGP15.phx.gbl...
> %Disk time is a useless counter on high-end IO systems. It was designed
> for sequential command IO systems, not current SCSI systems that support
> Command Tag Queuing or the new SATA Native Command Queuing options. These
> are the device-level commands that support scatter-gather IO. Short
> description is that they full-duplex command and response for IO. The
> device can queue a large number of IO requests, sort them optimally, and
> respond to them asychronously and asequentially. RAID subsystems further
> reduced the effectiveness of this counter buy abstracting a large number
> of physical devices into one logical device presented to the OS..
> I use Physical disk | Disk Read Bytes/sec, Disk Write Bytes/sec, Disk
> Reads/sec, Disk Writes/sec, and Disk Queue Length to determine if I am IO
> bound. Of course, you need to know the maximum capabilities of your IO
> subsystem to use these numbers effectively. But since you benchmarked the
> IO system with IOMeter when you built the server that is an easy
> comparison.
> --
> Geoff N. Hiten
> Senior Database Administrator
> Microsoft SQL Server MVP
>
> "imarchenko" <igormarchenko@.hotmail.com> wrote in message
> news:eKaIHNaqFHA.4044@.TK2MSFTNGP09.phx.gbl...
>


Saturday, February 11, 2012

#temp tables

I have the following sproc that has been in production for several months.
When attempting to set this sproc as my data source I get the following error.
There is an error in the query. Implicit conversion from data type
sql_variant to int is not allowed. Use the CONVERT function to run this query.
If I comment most of the sproc except for the create temp table and insert
into it...then it will work (which does me no good).
Please help! I've tried using global temp tables and even just using a real table that's being populated...but I still get the same error.
CREATE proc dbo.spReports_DailySalesSummary
    @GroupID int,
    @date datetime = null,
    @summary bit = 0,
    @GroupVal char(1)
as

declare @LastMonth datetime

if @date is null
    set @date = getdate()

set @LastMonth = Cast(Cast(Month(@date) as varchar(2)) + '/1/' + Cast(Year(@date) as varchar(4)) as datetime)

CREATE TABLE #tmpSalesCounts
(
    DayCode tinyint,
    SalesRepName varchar(30),
    JobCount int,
    MailCount int
)

IF @GroupVal = 'C'
BEGIN
    insert into #tmpSalesCounts
    select 1 as DayCode, CSRName as SalesRepName, count(*) as JobCount, sum(JobQuantity) as MailCount
    from
        (select convert(varchar(14), OrderDate, 107) as OrderDate, CSRName, JobQuantity
         from vwJobs
         where CSRid in (select SalesRepID from tblSalesRepGroup where SalesGroupID = @GroupID)
           and OrderDate = @date) t1
    group by CSRName
END
ELSE
BEGIN
    insert into #tmpSalesCounts
    select 1 as DayCode, SalesRepName, count(*) as JobCount, sum(JobQuantity) as MailCount
    from
        (select convert(varchar(14), OrderDate, 107) as OrderDate, SalesRepName, JobQuantity
         from vwJobs
         where SalesRepID in (select SalesRepID from tblSalesRepGroup where SalesGroupID = @GroupID)
           and OrderDate = @date) t1
    group by SalesRepName
END

IF @Summary = 0
    select t2.SalesRepName, PeriodID, PeriodDesc, isnull(JobCount, 0) Jobs, isnull(MailCount, 0) as MailPieces
    from #tmpSalesCounts t1
    right join (select PeriodID, PeriodDesc, FirstName + ' ' + LastName as SalesRepName
                from tblReportPeriods cross join tblUsers
                where RepType like '%' + @GroupVal + '%'
                  and UserId in (select SalesRepID from tblSalesRepGroup where SalesGroupID = @GroupID)) t2
        on t1.DayCode = t2.PeriodID and t1.SalesRepName = t2.SalesRepName
    order by t2.SalesRepName, PeriodID
ELSE
    select PeriodID, PeriodDesc, isnull(sum(JobCount), 0) Jobs, isnull(sum(MailCount), 0) as MailPieces
    from #tmpSalesCounts t1
    right join (select PeriodID, PeriodDesc, FirstName + ' ' + LastName as SalesRepName
                from tblReportPeriods cross join tblUsers
                where RepType like '%' + @GroupVal + '%'
                  and UserId in (select SalesRepID from tblSalesRepGroup where SalesGroupID = @GroupID)) t2
        on t1.DayCode = t2.PeriodID and t1.SalesRepName = t2.SalesRepName
    group by PeriodID, PeriodDesc
    order by PeriodID
GO

Following up on my own message. It seems to have something to do with the
creation of the #temp table. I gave the credentials I'm using owner access
to the db, but that does not seem to help.
"FL Jim" wrote:
> I have the following sproc that has been in production for severa months.
> When attempting to set this sproc as my data source I get the following error.
> There is an error in the query. Implicit conversion from data type
> sql_variant to int is not allowed. Use the CONVERT function to run this query.
> If I comment most of the sproc except for the create temp table and insert
> into it...then it will work (which does me no good).
> Please help! I've tried using global temp tables and even just using a
> real table that's being populated...but I sitll get the same error.
>
> CREATE proc dbo.spReports_DailySalesSummary @.GroupID int, @.date datetime => null, @.summary bit = 0, @.GroupVal char(1) as
>
> declare @.LastMonth datetime
> if @.date is null
> set @.date = getdate()
> set @.LastMonth = Cast( Cast(Month(@.date) as varchar(2)) + '/1/' +
> Cast(year(@.date) as varchar(4)) as datetime)
> CREATE TABLE #tmpSalesCounts
> (
> DayCode tinyint,
> SalesRepName varchar(30),
> JobCount int,
> MailCount int
> )
> IF @.GroupVal = 'C'
> BEGIN
> insert into #tmpSalesCounts
> select 1 as DayCode, CSRName as SalesRepName, count(*) as JobCount,
> sum(jobQuantity) as MailCount
> from
> (select convert( varchar(14), OrderDate, 107) as OrderDate, CSRName,
> JobQuantity
> from vwJobs where CSRid in (select SalesRepID from tblSalesRepGroup where
> SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> group by CSRName
> END
> ELSE
> BEGIN
> insert into #tmpSalesCounts
> select 1 as DayCode, SalesRepName, count(*) as JobCount, sum(jobQuantity)
> as MailCount
> from
> (select convert( varchar(14), OrderDate, 107) as OrderDate, SalesRepName,
> JobQuantity
> from vwJobs where salesrepid in (select SalesRepID from tblSalesRepGroup
> where SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> group by SalesRepName
> END
>
> IF @.Summary = 0
> select t2.SalesRepName, PeriodID, PeriodDesc, isnull(JobCount, 0) Jobs,
> isnull(MailCount, 0) as MailPieces
> from #tmpSalesCounts t1
> right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> SalesRepName from tblReportPeriods cross join tblUsers
> where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> order by t2.SalesRepName, PeriodID
> ELSE
> select PeriodID, PeriodDesc, isnull(sum(JobCount), 0) Jobs,
> isnull(Sum(MailCount), 0) as MailPieces
> from #tmpSalesCounts t1
> right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> SalesRepName from tblReportPeriods cross join tblUsers
> where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> group by PeriodID, PeriodDesc
> order by PeriodID
> GO
>
|||Please answer the following questions to enable me to help you.
1. Does it work from Query Analyzer?
2. Do you get this error from the data tab or when you try to view it?
3. Is this an error in development or in production?
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"FL Jim" <FLJim@.discussions.microsoft.com> wrote in message
news:A4A39F90-9F48-446E-99DE-E73E9D0A512E@.microsoft.com...
>I have the following sproc that has been in production for severa months.
> When attempting to set this sproc as my data source I get the following
> error.
> There is an error in the query. Implicit conversion from data type
> sql_variant to int is not allowed. Use the CONVERT function to run this
> query.
> If I comment most of the sproc except for the create temp table and insert
> into it...then it will work (which does me no good).
> Please help! I've tried using global temp tables and even just using a
> real table that's being populated...but I sitll get the same error.
>
> CREATE proc dbo.spReports_DailySalesSummary @.GroupID int, @.date datetime => null, @.summary bit = 0, @.GroupVal char(1) as
>
> declare @.LastMonth datetime
> if @.date is null
> set @.date = getdate()
> set @.LastMonth = Cast( Cast(Month(@.date) as varchar(2)) + '/1/' +
> Cast(year(@.date) as varchar(4)) as datetime)
> CREATE TABLE #tmpSalesCounts
> (
> DayCode tinyint,
> SalesRepName varchar(30),
> JobCount int,
> MailCount int
> )
> IF @.GroupVal = 'C'
> BEGIN
> insert into #tmpSalesCounts
> select 1 as DayCode, CSRName as SalesRepName, count(*) as JobCount,
> sum(jobQuantity) as MailCount
> from
> (select convert( varchar(14), OrderDate, 107) as OrderDate, CSRName,
> JobQuantity
> from vwJobs where CSRid in (select SalesRepID from tblSalesRepGroup where
> SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> group by CSRName
> END
> ELSE
> BEGIN
> insert into #tmpSalesCounts
> select 1 as DayCode, SalesRepName, count(*) as JobCount, sum(jobQuantity)
> as MailCount
> from
> (select convert( varchar(14), OrderDate, 107) as OrderDate, SalesRepName,
> JobQuantity
> from vwJobs where salesrepid in (select SalesRepID from tblSalesRepGroup
> where SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> group by SalesRepName
> END
>
> IF @.Summary = 0
> select t2.SalesRepName, PeriodID, PeriodDesc, isnull(JobCount, 0) Jobs,
> isnull(MailCount, 0) as MailPieces
> from #tmpSalesCounts t1
> right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> SalesRepName from tblReportPeriods cross join tblUsers
> where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> order by t2.SalesRepName, PeriodID
> ELSE
> select PeriodID, PeriodDesc, isnull(sum(JobCount), 0) Jobs,
> isnull(Sum(MailCount), 0) as MailPieces
> from #tmpSalesCounts t1
> right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> SalesRepName from tblReportPeriods cross join tblUsers
> where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> group by PeriodID, PeriodDesc
> order by PeriodID
> GO
>
>|||1) Yes
2) I had entered Exec spReports_DailySalesSummary2 @GroupID, @date datetime, @summary, @GroupVal when trying to connect to the data source
3) Development.
I found another related thread and have tried their suggestion. I connect to a table in the db, just to get myself to the Data tab, then switch the Command Type to stored procedure and just enter the sproc name. It seems to work this way. It's rather strange, though, because for all of my other reports I was able to just enter the exec sproc (with parameter fields) in the Query String box of the new report wizard. I guess it's just a quirk...I don't know.
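
In other words (a sketch, using the parameter names from the proc above): with the command type left as text, the query string was an EXEC call like

exec dbo.spReports_DailySalesSummary @GroupID, @date, @summary, @GroupVal

whereas with the command type set to stored procedure, the query string is only the bare name

dbo.spReports_DailySalesSummary

and the report parameters are bound to the procedure's parameters automatically.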
"Bruce L-C [MVP]" wrote:
> Please answer the following question to enable me to help you.
> 1. Does it work from Query Analyzer?
> 2. Do you get this error from the data tab or when you try to view it?
> 3. Is this an error in development or in production?
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
>
> "FL Jim" <FLJim@.discussions.microsoft.com> wrote in message
> news:A4A39F90-9F48-446E-99DE-E73E9D0A512E@.microsoft.com...
> >I have the following sproc that has been in production for severa months.
> > When attempting to set this sproc as my data source I get the following
> > error.
> >
> > There is an error in the query. Implicit conversion from data type
> > sql_variant to int is not allowed. Use the CONVERT function to run this
> > query.
> >
> > If I comment most of the sproc except for the create temp table and insert
> > into it...then it will work (which does me no good).
> >
> > Please help! I've tried using global temp tables and even just using a
> > real table that's being populated...but I sitll get the same error.
> >
> >
> >
> > CREATE proc dbo.spReports_DailySalesSummary @.GroupID int, @.date datetime => > null, @.summary bit = 0, @.GroupVal char(1) as
> >
> >
> > declare @.LastMonth datetime
> >
> > if @.date is null
> > set @.date = getdate()
> >
> > set @.LastMonth = Cast( Cast(Month(@.date) as varchar(2)) + '/1/' +
> > Cast(year(@.date) as varchar(4)) as datetime)
> >
> > CREATE TABLE #tmpSalesCounts
> > (
> > DayCode tinyint,
> > SalesRepName varchar(30),
> > JobCount int,
> > MailCount int
> > )
> >
> > IF @.GroupVal = 'C'
> > BEGIN
> > insert into #tmpSalesCounts
> > select 1 as DayCode, CSRName as SalesRepName, count(*) as JobCount,
> > sum(jobQuantity) as MailCount
> > from
> > (select convert( varchar(14), OrderDate, 107) as OrderDate, CSRName,
> > JobQuantity
> > from vwJobs where CSRid in (select SalesRepID from tblSalesRepGroup where
> > SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> >
> > group by CSRName
> >
> > END
> > ELSE
> > BEGIN
> > insert into #tmpSalesCounts
> > select 1 as DayCode, SalesRepName, count(*) as JobCount, sum(jobQuantity)
> > as MailCount
> > from
> > (select convert( varchar(14), OrderDate, 107) as OrderDate, SalesRepName,
> > JobQuantity
> > from vwJobs where salesrepid in (select SalesRepID from tblSalesRepGroup
> > where SalesGroupID = @.GroupID ) and orderdate =@.date ) t1
> >
> > group by SalesRepName
> >
> > END
> >
> >
> > IF @.Summary = 0
> > select t2.SalesRepName, PeriodID, PeriodDesc, isnull(JobCount, 0) Jobs,
> > isnull(MailCount, 0) as MailPieces
> > from #tmpSalesCounts t1
> > right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> > SalesRepName from tblReportPeriods cross join tblUsers
> > where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> > from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> > t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> > order by t2.SalesRepName, PeriodID
> > ELSE
> > select PeriodID, PeriodDesc, isnull(sum(JobCount), 0) Jobs,
> > isnull(Sum(MailCount), 0) as MailPieces
> > from #tmpSalesCounts t1
> > right join (select periodID, PeriodDesc, FirstName+' '+LastName as
> > SalesRepName from tblReportPeriods cross join tblUsers
> > where Reptype like '%' + @.GroupVal + '%' and UserId in (select SalesRepID
> > from tblSalesRepGroup where SalesGroupID = @.GroupID ) )
> > t2 on t1.Daycode = t2.PeriodID and t1.SalesRepname = t2.salesrepname
> > group by PeriodID, PeriodDesc
> > order by PeriodID
> > GO
> >
> >
> >
> >
>
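The error message itself says where to look: a sql_variant value is being compared to an int somewhere in this chain. Purely as an assumption (the definitions of vwJobs, tblSalesRepGroup, and tblUsers are not shown), if SalesRepID were stored as sql_variant, the explicit CONVERT the message asks for would look like this hypothetical rewrite of the first branch's filter, shown as it would sit inside the procedure:

-- Hypothetical: only needed if tblSalesRepGroup.SalesRepID were sql_variant;
-- CONVERT makes the comparison against the int column explicit.
select count(*) as JobCount, sum(JobQuantity) as MailCount
from vwJobs
where CSRid in (select convert(int, SalesRepID)
                from tblSalesRepGroup
                where SalesGroupID = @GroupID)
and orderdate = @date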
|||Good. Glad the problem is solved. I use SP a lot but I don't combine the SP
with the wizard. I tend to start with an empty report (add item instead of
add report). Create the dataset calling the SP. Then drop a table on the
layout and drag and drop fields.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"FL Jim" <FLJim@.discussions.microsoft.com> wrote in message
news:83876B6B-96E6-4BE9-8152-68DDF2D5FBA8@.microsoft.com...
> 1) Yes
> 2) I was entered in Exec spReports_DailySalesSummary2 @.GroupID, @.date
> datetime, @.summary, @.GroupVal when trying to connect to the data source
> 3) Development.
> I found another related thread and have tried their suggestion. I connect
> to a table in the db, just to get myself to the Data tab, then switch the
> Commany Type to stored procedure and just enter the sproc name. It seems
> to
> work this way. It's rather strange though, because all of my other
> reports I
> was able to just enter the exec sproc (with parameter fields) in the Query
> String box of the new report wizard. I guess it's just a quirk...I don't
> know.
> "Bruce L-C [MVP]" wrote:
>> <snip: the three questions and the original post with the full stored
>> procedure, quoted earlier in the thread>
|||I thought it was fixed...I failed to realize that while my dataset returns
data in the data tab, I can't see any fields for the sproc in the layout view
for the report. Basically, it's attached and returning data in the data
tab, but the fields aren't accessible in the report. Back to the drawing board.
I don't know if there's a way to refresh this or not...
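A plausible mechanism for the empty field list (an assumption on my part, but consistent with the symptoms): the report designer discovers a dataset's fields by executing the query in a metadata-only pass (SET FMTONLY ON), and stored procedures that stage rows in temp tables often return nothing useful from that pass. You can reproduce what the designer sees from Query Analyzer:

-- Approximates the designer's metadata-only probe; the parameter values are
-- placeholders and no real rows are fetched under FMTONLY.
SET FMTONLY ON
EXEC dbo.spReports_DailySalesSummary @GroupID = 1, @GroupVal = 'C'
SET FMTONLY OFF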
"Bruce L-C [MVP]" wrote:
> Good. Glad the problem is solved. I use SP a lot but I don't combine the SP
> with the wizard. I tend to start with an empty report (add item instead of
> add report). Create the dataset calling the SP. Then drop a table on the
> layout and drag and drop fields.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "FL Jim" <FLJim@discussions.microsoft.com> wrote in message
> news:83876B6B-96E6-4BE9-8152-68DDF2D5FBA8@microsoft.com...
> > <snip: FL Jim's answers and the full stored procedure, quoted above>
|||Yes. Click on the refresh fields button to the right of the "..." button in
the data tab (it looks like the refresh button for IE).
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"FL Jim" <FLJim@.discussions.microsoft.com> wrote in message
news:4FBB92FE-95CE-4FC5-9918-C9B26081895B@.microsoft.com...
>I thought it was fixed...I failed to realize that while my dataset returns
> data in the data tab, I can't see any fields for the sproc in the layout
> view
> for the report. Basically, it's attached and returning data in the data
> tab, but the fields are accesible in the report. Back to the drawing
> board.
> I don't know if there's a way to refresh this or not...
> "Bruce L-C [MVP]" wrote:
>> <snip: the earlier exchange and the full stored procedure, quoted above>
|||I recall way back when RS was in beta that SET NOCOUNT ON would often fix
this sort of problem. The code you gave in your sproc does not have it; I
wonder if you could try adding it and see.
Regards
K Duncan
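For reference, a sketch of where the setting would go; only the head of the posted procedure is shown, and the rest is unchanged:

CREATE proc dbo.spReports_DailySalesSummary @GroupID int, @date datetime = null,
@summary bit = 0, @GroupVal char(1) as

-- Suppress the 'N rows affected' messages the INSERT statements emit; those
-- extra done-messages are what tended to confuse early RS metadata parsing.
SET NOCOUNT ON

declare @LastMonth datetime
-- ...rest of the procedure as posted...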
"FL Jim" <FLJim@.discussions.microsoft.com> wrote in message
news:4FBB92FE-95CE-4FC5-9918-C9B26081895B@.microsoft.com...
>I thought it was fixed...I failed to realize that while my dataset returns
> data in the data tab, I can't see any fields for the sproc in the layout
> view
> for the report. Basically, it's attached and returning data in the data
> tab, but the fields are accesible in the report. Back to the drawing
> board.
> I don't know if there's a way to refresh this or not...
> "Bruce L-C [MVP]" wrote:
>> <snip: the earlier exchange and the full stored procedure, quoted above>
|||If you can return data in the data tab then you will be fine. All you need
to do is click on the (not very discoverable) refresh fields button. It is
to the right of the "..." button and looks like the refresh button for IE.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"K Duncan" <keith@.sourcearray.com> wrote in message
news:%231xTouL1FHA.2792@.tk2msftngp13.phx.gbl...
>I recall way back when RS was in beta that SET NOCOUNT ON would often fix
>this sort of problem. The code you gave in your sproc does not have it; I
>wonder if you could try adding it and see.
> Regards
> K Duncan
>
> "FL Jim" <FLJim@discussions.microsoft.com> wrote in message
> news:4FBB92FE-95CE-4FC5-9918-C9B26081895B@microsoft.com...
>> <snip: the rest of the thread, quoted above>