Showing posts with label file.

Thursday, March 29, 2012

Cannot Export to Excel with Huge Data

Hi All!

I have an issue.

I am calling an RDL file through the URL and I am passing Format=Excel in the URL.

E.g. http://harinarayana/ReportServer/......&Format=Excel&...........

If the data is more than around 20,000 records, it is not able to export, and

an error like "The Service is not available" is displayed.

Does anyone have any solution for scenarios like this? It would be of great help to me.

Regards

Hari

This should be the answer you're looking for:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=22652&SiteID=1

Hopefully SQL 2008 will ease these memory limitations.
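
For reference, a sketch of the standard Reporting Services URL-access parameters for rendering to Excel (the folder and report names below are hypothetical; rs:Command and rs:Format are the standard URL-access parameters). For very large result sets, rendering to CSV (rs:Format=CSV) is usually much lighter on report server memory than the Excel renderer.

http://harinarayana/ReportServer?/MyFolder/MyReport&rs:Command=Render&rs:Format=EXCEL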

Tuesday, March 27, 2012

Cannot empty a data file to another using DBCC SHRINKFILE EMPTYFILE

Hi,
A month ago, we were running out of disk space on a drive that stores
the data file for a SQL Server 7 database. So I added a new drive and
created a new data file on the primary filegroup and restricted the
growth of the first data file.
We have a new server now with heaps of space and I want to merge the two
data files into one but am running into huge issues. The reason why I
want to do this is to eliminate the possibility that the creation of the
new data file is why our database has started to run a bit slower.
When I try to execute DBCC SHRINKFILE (data2, EMPTYFILE) I get the
result "DBCC execution completed. If DBCC printed error messages,
contact your system administrator." within a matter of seconds... the
data2 file is still sitting on 11GB and the data1 file is on 28GB
(restricted file growth)... I noticed that the log file grew
significantly and the data2 file emptied out a little but it was still
at 11GB...
I have tried doing log backups, database backups then shrinking again
but still nothing...
If anyone knows how to get rid of the second primary data file it would
be MUCH appreciated...
Regards,
Auday Alwash
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!
What service pack level do you have? There was a bug in SQL 7.0 with DBCC
SHRINKFILE with the EMPTYFILE option where heaps (tables with no clustered
index) & text pages were not always emptied properly - it was fixed in SP3.
For more info see: http://support.microsoft.com/kb/279511/EN-US/
HTH.
Cheers,
Mike
"Auday Alwash" <odies@.hotmail.com> wrote in message
news:eN7xN2RzEHA.3408@.tk2msftngp13.phx.gbl...
> Hi,
> A month ago, we were running out of disk space on a drive that stores
> the data file for a SQL Server 7 database. So I added a new drive and
> created a new data file on the primary filegroup and restricted the
> growth of the first data file.
> We have a new server now with heaps of space and I want to merge the two
> data files into one but am running into huge issues. The reason why I
> want to do this is to eliminate the possibilty that the creation of the
> new data file is why our database has started to run a bit slower.
> When I try to execute the DBCC SHRINKFILE data2,EMPTYFILE) I get the the
> result "DBCC execution completed. If DBCC printed error messages,
> contact your system administrator." within a matter of seconds... the
> data2 file is still sitting on 11GB and the data1 file is on 28GB
> (restricted file growth)... I noticed that the log file grew
> significantly and the data2 file emptied out a little but it was still
> at 11GB...
> I have tried doing log backups, database backups then shrinking again
> but still nothing...
> If anyone knows how to get rid of the second primary data file it would
> be MUCH appreciated...
> Regards,
> Auday Alwash
>
>
> *** Sent via Developersdex http://www.codecomments.com ***
> Don't just participate in USENET...get rewarded for it!
|||Hi Mike,
SQL Server is on Service Pack 3 already...
Can you think of anything else that could be causing this?
Regards,
Auday
*** Sent via Developersdex http://www.codecomments.com ***
Don't just participate in USENET...get rewarded for it!
|||Auday
It removes the file which was emptied.
CREATE DATABASE test
GO
ALTER DATABASE test
ADD FILE
( NAME = ww,
FILENAME = 'D:\wwdat1.ndf',
SIZE = 5MB,
MAXSIZE = 100MB,
FILEGROWTH = 5MB)
GO
USE test -- the CREATE TABLE and DBCC SHRINKFILE below must run in the new database's context
GO
create table test..test(id int identity) on [primary]
--Performed many many inserts here.
dbcc shrinkfile(ww, EMPTYFILE)
GO
ALTER DATABASE test REMOVE FILE ww
select * from test.dbo.sysfiles
"Auday Alwash" <odies@.hotmail.com> wrote in message
news:OBXOVjUzEHA.1452@.TK2MSFTNGP11.phx.gbl...
> Hi Mike,
> SQL Server is on Service Pack 3 already...
> Can you think of anything else that could be causing this?
> Regards,
> Auday
>
> *** Sent via Developersdex http://www.codecomments.com ***
> Don't just participate in USENET...get rewarded for it!
|||Make sure there is no one using data in the file... I can imagine that would
prevent the data movement...
Wayne Snyder, MCDBA, SQL Server MVP
Mariner, Charlotte, NC
www.mariner-usa.com
(Please respond only to the newsgroups.)
I support the Professional Association of SQL Server (PASS) and its
community of SQL Server professionals.
www.sqlpass.org
"Auday Alwash" <odies@.hotmail.com> wrote in message
news:eN7xN2RzEHA.3408@.tk2msftngp13.phx.gbl...
> Hi,
> A month ago, we were running out of disk space on a drive that stores
> the data file for a SQL Server 7 database. So I added a new drive and
> created a new data file on the primary filegroup and restricted the
> growth of the first data file.
> We have a new server now with heaps of space and I want to merge the two
> data files into one but am running into huge issues. The reason why I
> want to do this is to eliminate the possibilty that the creation of the
> new data file is why our database has started to run a bit slower.
> When I try to execute the DBCC SHRINKFILE data2,EMPTYFILE) I get the the
> result "DBCC execution completed. If DBCC printed error messages,
> contact your system administrator." within a matter of seconds... the
> data2 file is still sitting on 11GB and the data1 file is on 28GB
> (restricted file growth)... I noticed that the log file grew
> significantly and the data2 file emptied out a little but it was still
> at 11GB...
> I have tried doing log backups, database backups then shrinking again
> but still nothing...
> If anyone knows how to get rid of the second primary data file it would
> be MUCH appreciated...
> Regards,
> Auday Alwash
>
>
> *** Sent via Developersdex http://www.codecomments.com ***
> Don't just participate in USENET...get rewarded for it!
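
For reference, a minimal sketch (using FILEPROPERTY syntax as in SQL Server 2000; the database name mydb is a placeholder) of how to check how much of the allocated space in each data file is actually in use, which helps explain why data2 can still show 11GB on disk while being nearly empty:

-- Allocated size vs. space actually used, per data file (run in the user database)
SELECT name,
size / 128 AS allocated_mb,
CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128 AS used_mb
FROM sysfiles

-- Once the file reports little or no used space, it can be emptied and removed
DBCC SHRINKFILE (data2, EMPTYFILE)
ALTER DATABASE mydb REMOVE FILE data2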


Sunday, March 25, 2012

cannot drop mdf file

Hi All,
I cannot drop an mdf file that doesn't belong to any database but is in the data
directory.
Can you help me?
I'm using SQL Server 2005.
TIA
Yeah!!!
shift+delete is the solution.
Please ignore this!!!
"Roger" <roger@.rnospam.com> escreveu na mensagem
news:%23lOqzGHLIHA.4684@.TK2MSFTNGP06.phx.gbl...
> Hi All,
>
> I cannot drop a mdf file that don't have a database but is in data
> directory,.
> Can you help-me ?
> I'am using sql server 2005.
> TIA
>
>
|||Shift + Delete would just delete the file without any warning\confirmation.
That's only the difference between Delete and Shift + Delete.
The file must have been in use by another process when you first tried to delete
it; when you tried Delete with Shift it was no longer being used, so
it was deleted, I believe.
Ekrem Önsoy
"Roger" <roger@.rnospam.com> wrote in message
news:%23dkmWQHLIHA.1188@.TK2MSFTNGP04.phx.gbl...
> Yeah!!!
> shift+delete is the solution.
> Please ignore this!!!
>
> "Roger" <roger@.rnospam.com> escreveu na mensagem
> news:%23lOqzGHLIHA.4684@.TK2MSFTNGP06.phx.gbl...
>
|||Actually, the biggest difference between delete and shift+delete is that
shift+delete
doesn't send files to the recycle bin.
"Ekrem ?nsoy" wrote:

> Shift + Delete would just delete the file without any warning\confirmation.
> That's only the difference between Delete and Shift + Delete.
> File must be using by another process somehow when you were trying to delete
> the file and when you tried the Delete with Shift it was not being used so
> it was deleted I believe.
> --
> Ekrem Önsoy
>
> "Roger" <roger@.rnospam.com> wrote in message
> news:%23dkmWQHLIHA.1188@.TK2MSFTNGP04.phx.gbl...
>
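
If a stray .mdf will not delete because SQL Server still has it open, a quick check (a minimal sketch for SQL Server 2005; the file name and database name below are hypothetical) is to see whether any attached database still references the physical file, and detach it before deleting:

-- List any databases that still reference the file
SELECT DB_NAME(database_id) AS database_name, physical_name
FROM sys.master_files
WHERE physical_name LIKE '%orphaned.mdf%'

-- If a database still owns the file, detach it first, then delete the file from disk
EXEC sp_detach_db 'OrphanedDb'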


Thursday, March 22, 2012

cannot delete user & shrink log file

Hello... I'm facing two problems:
1. When I try to delete a user from a specific database,
it reports that that user cannot be dropped as it owns
objects. The db_owner is 'sa' not this user.
2. When I try to shrink the size of my log file, it
reports, "Cannot shrink log file 2 (cms1_log) because all
logical log files are in use."
Please help. Thanks.
1. My guess is that the user owns some objects. If you are used to
Enterprise Manager then expand each object list and see if the user owns
anything like tables, views, stored procedures, user-defined functions,
and so on.
2. What is your recovery model? Right-click the database in EM, then click
Properties, then Options.
"Rob" <rhchin@.hotmail.com> wrote in message
news:5d8901c35784$5b35fbf0$a001280a@.phx.gbl...
> Hello... I'm facing two problems:
> 1. When I try to delete a user from a specific database,
> it reports that that user cannot be dropped as it owns
> objects. The db_owner is 'sa' not this user.
> 2. When I try to shrink the size of my log file, it
> reports, "Cannot shrink log file 2 (cms1_log) because all
> logical log files are in use."
> Please help. Thanks.|||Rob,
#1. It could also be that the user owns other objects like tables or
stored procedures. Using the query below, you can find those objects:
SELECT [name]
FROM sysobjects
WHERE uid=USER_ID('<user_name>')
Once there, try to change the ownership to another user, using
sp_changeobjectowner.
#2. Refer to:
INF: How to Shrink the SQL Server 7.0 Transaction Log
http://support.microsoft.com/support/kb/Articles/q256/6/50.asp
INF: Shrinking the Transaction Log in SQL Server 2000 with DBCC SHRINKFILE
http://support.microsoft.com/support/kb/Articles/q272/3/18.asp
Dinesh.
SQL Server FAQ at
http://www.tkdinesh.com
"Rob" <rhchin@.hotmail.com> wrote in message
news:5d8901c35784$5b35fbf0$a001280a@.phx.gbl...
> Hello... I'm facing two problems:
> 1. When I try to delete a user from a specific database,
> it reports that that user cannot be dropped as it owns
> objects. The db_owner is 'sa' not this user.
> 2. When I try to shrink the size of my log file, it
> reports, "Cannot shrink log file 2 (cms1_log) because all
> logical log files are in use."
> Please help. Thanks.|||Thanks for your response...
>Once there, try to change the ownership to another user,
using
>sp_changeobjectowner.
I can see many objects owned by the user I wish to drop.
However, when I try to use the sp_changeobjectowner sproc,
it tells me that the object does not exist or is not a
valid object for this operation.
Thanks again.
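
A minimal sketch of the fixes the replies describe (object, user, and database names here are hypothetical, except cms1_log, which comes from the error message). Note that sp_changeobjectowner expects the object name qualified with its current owner, which is a common cause of the "not a valid object for this operation" error:

-- #1: generate the ownership-change statements for every object owned by the old user
SELECT 'EXEC sp_changeobjectowner ''' + USER_NAME(uid) + '.' + [name] + ''', ''dbo'''
FROM sysobjects
WHERE uid = USER_ID('old_user')
AND type IN ('U', 'V', 'P', 'FN', 'IF', 'TF')

-- a single call must qualify the object with its current owner
EXEC sp_changeobjectowner 'old_user.SomeTable', 'dbo'

-- #2: in SQL Server 2000, back up (or truncate) the log before shrinking the logical log file
BACKUP LOG cms1 WITH TRUNCATE_ONLY
DBCC SHRINKFILE (cms1_log, 100)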

Cannot delete SQL 2000 backup file

I have a .bak file that I am trying to delete because the backup failed due
to insufficient space on the disk drive. The message I get is that it is
in use by another person or program. I cannot find the person or program.
What should I be looking for? I use native SQL backup.
Message posted via http://www.sqlmonster.com
You may have to stop the SQLServerAgent service so that it will release the
file.
If that doesn't work, try the MSSQLServer service. One of the two is
holding on to the file.
Rick Sawtell
MCT, MCSD, MCDBA
"Debra Camacho-Padron via SQLMonster.com" <forum@.SQLMonster.com> wrote in
message news:2a3f51e677de49ec92f681f87f14936c@.SQLMonster.c om...
> I have a .bak file that I am trying to delete because the backup failed
due
> to insufficient space on the disk drive. The message I get is that it is
> in use by another person or program. I cannot find the person or program.
> What should I be looking for? I use native SQL backup.
> --
> Message posted via http://www.sqlmonster.com
|||That's what I thought too. I tried both, and it still indicates that the
file is in use.
Message posted via http://www.sqlmonster.com
|||Try using FileMon from sysinternals to see what is accessing
the file:
http://www.sysinternals.com/ntw2k/source/filemon.shtml
-Sue
On Tue, 15 Mar 2005 18:01:24 GMT, "Debra Camacho-Padron via
SQLMonster.com" <forum@.SQLMonster.com> wrote:

>That's what I thought too. I tried both, and it still indicates that the
>file is in use.


Monday, March 19, 2012

cannot create new sql express database

Hi,

I am trying to create a new Sql Express Database (.mdf file) from inside Visual Studio 2005. I did the following:

1) Created a Database Project in Visual Studio.

2) Choose Data Source to be Microsoft SQL Server Database File

3) Browse to a directory and enter a file name for the new database, say abc.mdf

4) When I click OK, Visual Studio prompts me if I want to create the database file, I click yes.

5) After a while, I get this error message:

An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

Then I opened SQL Server Configuration Manager, and I enabled Named Pipes and TCP/IP for both SQLEXPRESS and MSSQLSERVER. It prompts me that the settings will not change until the service is re-started.

When I go to Administrative Tools -> Services, I see that SQLServer (MSSQLSERVER) is started, and SQL Server (SQLEXPRESS) is stopped. I can stop the MSSQLSERVER service. But when I tried to start SQLEXPRESS service, I get the following error message:

Windows could not start the SQL Server (SQLEXPRESS) on Local Computer. For more information, review the System Event Log. If this is a non-Microsoft service, contact the service vendor, and refer to service-specific error code 17058.

I have Visual Studio 2005 and SQL Server 2005 installed on my machine, and I believe I am logged on using an administrator account.

Can someone help me to figure out what's wrong please?

Thanks and regards,

Alice

It seems that something is wrong with your SQL Express installation if it won't start. You should check the error log to see if it gives more details about the problem starting the service. The error log is generally located at C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\LOG (Your InstanceID may be different).

When you check the error log, please also verify that you are not running a beta version of SQL Express. The version number should contain either 9.00.1399 or 9.00.2047.

Mike

|||

Hi,

It turns out SQL Express wasn't installed properly. I re-installed it and it works fine now. Thanks a lot.

Alice

Cannot create new SQL database file.

Hello,

I'm using MS Visual C# 2005 Express, but I can't create a new local database file in the "Add -> New Item" dialog. The Studio doesn't create the database, but shows a file not found message.

Any ideas?

Can you connect to SQL Server Express and create the database manually? If yes, then the issue is within Visual C# and you should post this issue on a Visual C# Express forum. If you encounter an error with manually creating the database, let us know what that error is.

Thanks
Laurentiu

|||

I can connect to the server with SQL Server Management Studio and create new databases. But Visual C# couldn't even connect to any existing database. If I click on "Test connection" in the Add Data Source dialog, the following message shows up:

"Unable to open physical file <path>. Operating System error 32 (Error not found) An attempt to attach a auto-named database failed. A database with the same name exists or the specified file cannot be opened or it's located on an UNC share."

I don't have much experience with SQL Server, therefore I have no idea what this error means. In fact there is no database with the same name, and I don't know why it couldn't access the file. Unfortunately I've never heard anything about UNC shares before.

Are there some important steps I have to consider before I create a database manually?

Thanks.

|||

Let's take these as two different problems: connecting to an existing database and creating a database. It's easy to start mixing up the behaviors and confuse the issue.

Creating a DB in VS

Could you describe the exact steps you're using in VS to create a new database? What is the exact error message you're getting? Does your effort result in a database file being created and where does the file end up?

Connecting to an existing DB in VS

Again, what are the exact steps you're using to connect to the database? Is the database already attached to SQL Express? Where is the database file located?

Thanks for the additional detail.

Regards,

Mike Wachal
SQL Express team

-
Please mark your thread as Answered when you get your solution.

|||

Create a DB in VS

When I tried to create a new DB in VS I followed these steps: First of all I loaded my solution, then right-clicked my project in the Solution Explorer and selected "Add Item -> New Item" in the context menu. Finally I selected the SQL DB in the "Add New Item" dialog in order to add a local database file to my project. But it still doesn't work, although I have tried it with various file names. VS shows a short error message ("The system cannot find the specified file.") with the title "Local database file", and I cannot find a database file either.

Is it necessary to make any further initialisations before I can use SQL files in VS? I've already done a lot with VS but I have never used SQL before. Because of that I didn't configure anything, but the process or service is running in the background.

Connecting to an existing DB in VS

In order to isolate the problem I created a new DB with MS SQL Server Management Studio Express and tried to connect to the new database in VS using the following steps: I clicked "Add New Data Source" in the Data Source frame in VS. Then I selected a database in the dialog and the file of the database I had created with Management Studio before. If I detach the DB, the VS test connection succeeds, but if I click "OK" VS again shows the same error as when I create the DB directly in VS ("The system cannot find the specified file."). If I don't detach the DB, VS fails to establish a test connection because the file is already in use. I moved the database file to different directories but nothing changed.

In the end it seems to me that VS can neither create a database nor connect to an existing one.

Thanks for your support.

Björn

|||

Hi Bjorn,

Create DB in VS

I'll see if any of the VS folks have an idea on this one. What you describe should work.

Connecting to Existing DB in VS

The error you're getting here is expected. In this case, VS is actually creating a connection string that starts a new instance of SQL Express (called a User Instance) and attaches the database to that new instance. Since you can only have a database attached to a single instance of SQL, you get an error. As you've noticed, this problem doesn't happen when you've detached the database from the main instance.

Regards,

Mike Wachal
SQL Express team

-
Check out my tips for getting your answer faster and how to ask a good question: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=307712&SiteID=1
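
For context, a hedged example of the kind of connection string VS builds for a local database file (the instance name and path here are hypothetical); it is the AttachDbFilename/User Instance combination that requires the .mdf to be detached from the main SQL Express instance first:

Data Source=.\SQLEXPRESS;AttachDbFilename=C:\Projects\MyApp\MyDatabase.mdf;Integrated Security=True;User Instance=True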

|||

A couple of things. First, it sounds like SQL Server Express (SSE) is not installed on your machine. SSE allows you to make file-based connections to a database without a logical database name (as is the default behavior for the "local" database story in VS).

Secondly, if you are using VC#, VB, or VC Express, you cannot even connect to an up-level version of SQL Server. You can only connect to SSE (the free version of SQL Server). And, in particular, you can *only* make a local database style connection via the tools.

For the version of SQL Server that is running, see if the instance name is "SQLExpress". If it is not, that is the most likely problem. You can download it from the web for free.

Lance

This posting is provided "AS IS" and confers no rights and provides no warranties.

|||

First of all thanks for your support, and you are right. It seems I have a SQL Server 2005 version on my system instead of SQL 2005 Express. Therefore I downloaded the SQL Express version (with Advanced Services) from Microsoft.com, but it turns out that I've got another problem. Since I already had a running SQL Server on my machine (named "Microsoft SQL Server 2005" in Control Panel -> Software and running 2 background processes, SQLServ and SQLwriter), I followed the download notice and tried to remove the old version. Unfortunately the wizard halts with this meaningful error:

"TITLE: Microsoft SQL Server 2005 Setup. MESSAGE: There was an unexpected failure during the setup wizard. You may review the setup logs and/or click the help button for more information."

The message comes up whether I stop all SQL processes in the SQL Configuration Manager or not. Furthermore the help button and the related help page don't provide any additional information.

Hence I tried the other way outlined by the installation instructions. I removed all files from the "\MSSQL.1\MSSQL\Template Data\" directory and ran the installation routine of SQL 2005 Express afterwards, but surprisingly the new wizard halts with the same error already mentioned above.

It may be useful to know that I didn't install the old version explicitly. I guess it came bundled with Visual C# 2005 Express.

Björn

|||

Hi Bjorn,

There seems to be some confusion about whether you actually have SQL Express or not. You've said you don't, but the Template Data directory shouldn't even exist unless you actually have SQL Express installed. We need to clear up what you've installed and what you haven't. All versions of VS Express give you the option to install SQL Express.

To determine what you have installed, could you open up SQL Configuration Manager and let me know how many instances of SQL you have listed and what each of them is named?

Also, there should be more information in the setup error logs about the failure you're experiencing. Could you take a look at those logs and let me know what they say?

Thanks for the additional information.

Regards,

Mike Wachal
SQL Express team

-
Check out my tips for getting your answer faster and how to ask a good question: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=307712&SiteID=1

|||

Hello,

There are two instances listed in the SQL Configuration Manager. The first is labeled "SQL Server (SQLExpress)" and is running. The second one is "SQL Server Browser", it's stopped and I can't start it (the run button is disabled).

I found the following setup log:

Microsoft SQL Server 2005 Setup beginning at Wed Apr 26 13:59:32 2006
Process ID : 4028
D:\Programme\Microsoft SQL Server\90\Setup Bootstrap\setup.exe Version: 2005.90.1399.0
Running: LoadResourcesAction at: 2006/3/26 13:59:32
Complete: LoadResourcesAction at: 2006/3/26 13:59:32, returned true
Running: ParseBootstrapOptionsAction at: 2006/3/26 13:59:32
Loaded DLL:D:\Programme\Microsoft SQL Server\90\Setup Bootstrap\xmlrw.dll Version:2.0.3604.0
Complete: ParseBootstrapOptionsAction at: 2006/3/26 13:59:32, returned false
Error: Action "ParseBootstrapOptionsAction" failed during execution. Error information reported during run:
Could not parse command line due to datastore exception.
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 124
-
writeEncryptedString() failed
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 123
-
Error Code: 0x80070002 (2)
Windows Error Text: Das System kann die angegebene Datei nicht finden. [The system cannot find the specified file.]

Source File Name: cryptohelper\cryptsameusersamemachine.cpp
Compiler Timestamp: Mon Jun 13 14:30:00 2005
Function Name: sqls::CryptSameUserSameMachine::ProtectData
Source Line Number: 50

2
Running: ValidateWinNTAction at: 2006/3/26 13:59:32
Complete: ValidateWinNTAction at: 2006/3/26 13:59:32, returned true
Running: ValidateMinOSAction at: 2006/3/26 13:59:32
Complete: ValidateMinOSAction at: 2006/3/26 13:59:32, returned true
Running: PerformSCCAction at: 2006/3/26 13:59:32
Complete: PerformSCCAction at: 2006/3/26 13:59:32, returned true
Running: ActivateLoggingAction at: 2006/3/26 13:59:32
Error: Action "ActivateLoggingAction" threw an exception during execution. Error information reported during run:
Datastore exception while trying to write logging properties.
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:49 2005
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
-
Failed to find property "primaryLogFiles" {"SetupStateScope", "", ""} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:50 2005
Function Name: SetupStateScope.primaryLogFiles
Source Line Number: 44
-
No collector registered for scope: "SetupStateScope"
00D7CFC4Unable to proceed with setup, there was a command line parsing error. : 2
Error Code: 0x80070002 (2)
Windows Error Text: Das System kann die angegebene Datei nicht finden. [The system cannot find the specified file.]

Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 123

Failed to create CAB file due to datastore exception
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:49 2005
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
-
Failed to find property "HostSetup" {"SetupBootstrapOptionsScope", "", "4028"} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:50 2005
Function Name: SetupBootstrapOptionsScope.HostSetup
Source Line Number: 44
-
No collector registered for scope: "SetupBootstrapOptionsScope"
Message pump returning: 2

I installed it with VS C# Express and the version is Microsoft SQL Server 2005 - 9.00.1399.06 (Intel X86).

Björn

|||

Hi Bjorn,

At this point I'm not sure what's up with your server. From your first message there was something wrong with your copy of C# creating databases, but SQL Express seemed fine based on the fact that it worked with Management Studio.

If I were you, I'd try to uninstall SQL Express to get it completely off your system and then do a clean installation of either SQL Express or SQL Express with Advanced Services, whichever one you want. I'd try to get back to the state where SQL is working; then, if there is still a problem with C#, we can push this thread to one of the VS groups to see if they have any ideas on how to fix the VS problem.

Mike

|||

Hello,

I already tried to remove SQL Server from my system, but unfortunately there's the same problem. So I can't even uninstall it.

Errorlog:

Microsoft SQL Server 2005 Setup beginning at Wed May 03 13:16:44 2006
Process ID : 2348
D:\Programme\Microsoft SQL Server\90\Setup Bootstrap\setup.exe Version: 2005.90.1399.0
Running: LoadResourcesAction at: 2006/4/3 13:16:44
Complete: LoadResourcesAction at: 2006/4/3 13:16:44, returned true
Running: ParseBootstrapOptionsAction at: 2006/4/3 13:16:44
Loaded DLL:D:\Programme\Microsoft SQL Server\90\Setup Bootstrap\xmlrw.dll Version:2.0.3604.0
Complete: ParseBootstrapOptionsAction at: 2006/4/3 13:16:44, returned false
Error: Action "ParseBootstrapOptionsAction" failed during execution. Error information reported during run:
Could not parse command line due to datastore exception.
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 124
-
writeEncryptedString() failed
Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 123
-
Error Code: 0x80070002 (2)
Windows Error Text: Das System kann die angegebene Datei nicht finden. [The system cannot find the specified file.]

Source File Name: cryptohelper\cryptsameusersamemachine.cpp
Compiler Timestamp: Mon Jun 13 14:30:00 2005
Function Name: sqls::CryptSameUserSameMachine::ProtectData
Source Line Number: 50

2
Running: ValidateWinNTAction at: 2006/4/3 13:16:44
Complete: ValidateWinNTAction at: 2006/4/3 13:16:44, returned true
Running: ValidateMinOSAction at: 2006/4/3 13:16:44
Complete: ValidateMinOSAction at: 2006/4/3 13:16:44, returned true
Running: PerformSCCAction at: 2006/4/3 13:16:44
Complete: PerformSCCAction at: 2006/4/3 13:16:44, returned true
Running: ActivateLoggingAction at: 2006/4/3 13:16:44
Error: Action "ActivateLoggingAction" threw an exception during execution. Error information reported during run:
Datastore exception while trying to write logging properties.
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:49 2005
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
-
Failed to find property "primaryLogFiles" {"SetupStateScope", "", ""} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:50 2005
Function Name: SetupStateScope.primaryLogFiles
Source Line Number: 44
-
No collector registered for scope: "SetupStateScope"
00D7CFC4Unable to proceed with setup, there was a command line parsing error. : 2
Error Code: 0x80070002 (2)
Windows Error Text: Das System kann die angegebene Datei nicht finden. [The system cannot find the specified file.]

Source File Name: utillib\persisthelpers.cpp
Compiler Timestamp: Fri Jul 29 01:13:55 2005
Function Name: writeEncryptedString
Source Line Number: 123

Failed to create CAB file due to datastore exception
Source File Name: datastore\cachedpropertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:49 2005
Function Name: CachedPropertyCollection::findProperty
Source Line Number: 130
-
Failed to find property "HostSetup" {"SetupBootstrapOptionsScope", "", "2348"} in cache
Source File Name: datastore\propertycollection.cpp
Compiler Timestamp: Fri Jul 29 01:13:50 2005
Function Name: SetupBootstrapOptionsScope.HostSetup
Source Line Number: 44
-
No collector registered for scope: "SetupBootstrapOptionsScope"
Message pump returning: 2

Is it possible that my Windows installation is corrupt? I don't have any other problems except that my system is quite slow.

Björn

|||

Hi Bjorn,

It looks like something failed to uninstall correctly. You can find some suggestions on how to clean up your system at the following link:

http://blogs/jdbaker/archive/2006/04/20/10684.aspx

I hope that helps

Michelle

Cannot Create Full-Text Catalog

I had a perfectly good FT catalog, but I was trying to repopulate it after
making changes to the noiseENG.txt noise-word file. I restarted the Full-Text search
engine.
When I tried to run a full-text query against the table, I got this error
"no such interface supported". In the message there was a recommendation to
create a new catalog.
I decided to delete the existing catalog, but now cannot create a new one.
I get the following full message:
TITLE: Microsoft SQL Server Management Studio
Cannot execute changes.
Create failed for FullTextCatalog 'FTCDoctors'. (Microsoft.SqlServer.Smo)
For help, click:
[url]http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=9.00.3042.00&EvtSrc=Microsoft.SqlServer.Management.Smo.ExceptionTemplates.FailedOperationExceptionText&EvtID=Create+FullTextCatalog&LinkId=20476[/url]
An exception occurred while executing a Transact-SQL statement or batch.
(Microsoft.SqlServer.ConnectionInfo)
Execution of a full-text operation failed. 'No such interface supported'
(Microsoft SQL Server, Error: 7689)
For help, click:
[url]http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00.3054&EvtSrc=MSSQLServer&EvtID=7689&LinkId=20476[/url]
The links are dead links now. Figures.
Can anyone help me?
Thanks, Ross
Unable to find a cure, I uninstalled the Full-Text engine, rebooted,
reinstalled Full-Text, rebooted and was able to rebuild the catalog and
index. But I'd sure like to know what happened and what this error really
means.
Ross
"Ross Culver" <rculver@.warrenalloy.com> wrote in message
news:eOCFu7brHHA.4180@.TK2MSFTNGP04.phx.gbl...
>I had a perfectly good FT catalog, but I was trying to repopulate it after
> making changes to the noiseeng file. I restarted the Full-Text search
> engine.
> When I tried to run a full-text query against the table, I got this error
> "no such interface supported". In the message there was a recommendation
> to
> create a new catalog.
> I decided to delete the existing catalog, but now cannot create a new one.
> I get the following full message:
> TITLE: Microsoft SQL Server Management Studio
> Cannot execute changes.
> Create failed for FullTextCatalog 'FTCDoctors'. (Microsoft.SqlServer.Smo)
> For help, click:
> [url]http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=9.00. 3042.00&EvtSrc=Microsoft.SqlServer.Management.Smo. ExceptionTemplates.FailedOperationExceptionText&Ev tID=Create+FullTextCatalog&LinkId=20476[/url]
> --
> An exception occurred while executing a Transact-SQL statement or batch.
> (Microsoft.SqlServer.ConnectionInfo)
> --
> Execution of a full-text operation failed. 'No such interface supported'
> (Microsoft SQL Server, Error: 7689)
> For help, click:
> [url]http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=09.00 .3054&EvtSrc=MSSQLServer&EvtID=7689&LinkId=20476[/url]
> The links are dead links now. Figures.
> Can anyone help me?
> Thanks, Ross
>
>
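
For reference, a minimal sketch of recreating the catalog and index in SQL Server 2005 once the engine is healthy again (the table, column, and key index names are hypothetical; FTCDoctors is the catalog name from the post):

CREATE FULLTEXT CATALOG FTCDoctors
GO
CREATE FULLTEXT INDEX ON dbo.Doctors(Notes)
KEY INDEX PK_Doctors
ON FTCDoctors
WITH CHANGE_TRACKING AUTO
GO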

Thursday, March 8, 2012

Cannot create a file when that file already exists

Platform: VC 7.0, NT4.0
Database: SQL Server 2000
Hi,
I created an in-proc server. It uses registry-related functions like
RegOpenKeyEx.
When I execute this COM object from Visual Basic it works fine.
When I execute the same COM object from a SQL Server stored procedure it fails and
returns an error like "Cannot create a file when that file already exists". May
I know why the same code fails in the case of SQL Server?
Thanks in advance.
Regards
Vijay
Hi
CREATE FUNCTION dbo.fn_file_exists(@filename VARCHAR(300))
RETURNS INT
AS
BEGIN
DECLARE @file_exists AS INT
EXEC master..xp_fileexist @filename, @file_exists OUTPUT
RETURN @file_exists
END
GO
-- test
SELECT dbo.fn_file_exists('d:\test1.txt')
"vijay" <vijaysingh@.abo.abosoftware.com> wrote in message
news:ujQzO7VxDHA.1856@.TK2MSFTNGP09.phx.gbl...
> Platform: VC 7.0, NT4.0
> Database: SQL Server 2000
>
> Hi,
> I created an in-proc server. It uses registry related function like
> RegOpenKeyEx.
> When I execute this COM object from Visual Basic it works fine.
> When I execute same COM object from SQL Server Store procedure it files
and
> returns error like "Cannot create a file when that file already exists".
May
> I know why same code fails in case of SQL Server?
>
> Thanks in advance.
>
> Regards
> Vijay
>
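
For context, a hedged sketch of how a COM object is typically driven from T-SQL via the OLE Automation procedures (the ProgID and method name below are hypothetical); sp_OAGetErrorInfo is usually the quickest way to see the underlying COM error behind messages like "Cannot create a file when that file already exists":

DECLARE @obj int, @hr int
EXEC @hr = sp_OACreate 'MyCompany.MyComObject', @obj OUT
IF @hr <> 0 EXEC sp_OAGetErrorInfo @obj
EXEC @hr = sp_OAMethod @obj, 'DoWork'
IF @hr <> 0 EXEC sp_OAGetErrorInfo @obj
EXEC sp_OADestroy @obj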

Cannot create a file through a file share subscription

Hi
I am trying to create a file share subscription for a report. I have
specified a valid UNC path for the file.
When the schedule is run by SQL Agent, it returns the following error:
"Failure writing file Depth Distribution : The Report Server has encountered
a configuration error; more details in the log files"
When I look at the log file, there is no error indicated. It just says the
following:
w3wp!subscription!1ec4!05/04/2005-15:26:58:: Subscription Created for report
/TRACKERS Templates/Dist Tracker Template/Depth Distribution at
2005-05-04T15:26:58 by NSchneider
w3wp!library!1358!05/04/2005-15:26:59:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:26:59:: i INFO: Call to GetSystemPermissions
w3wp!library!1ec4!05/04/2005-15:27:02:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1ec4!05/04/2005-15:27:02:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:02:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:02:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:03:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:03:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:04:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:04:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:05:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:05:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:05:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:05:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:06:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:06:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:09:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:09:: i INFO: Call to GetSystemPermissions
w3wp!library!1ec4!05/04/2005-15:27:09:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1ec4!05/04/2005-15:27:09:: i INFO: Call to GetSystemPermissions
w3wp!library!1358!05/04/2005-15:27:10:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1358!05/04/2005-15:27:10:: i INFO: Call to GetSystemPermissions
w3wp!library!1ec4!05/04/2005-15:27:10:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1ec4!05/04/2005-15:27:10:: i INFO: Call to GetSystemPermissions
w3wp!library!1ec4!05/04/2005-15:27:11:: i INFO: Call to
GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
w3wp!library!1ec4!05/04/2005-15:27:11:: i INFO: Call to GetSystemPermissions
Does anyone know what is going on here?
Thanking you in advance
Hi,
This could be a permission issue. Make sure the account that executes the
subscription is a local user account and has access permission to the share
path.
Charles
"PORRASTAR" wrote:
> Hi
> I am trying to create a file share subscription for a report. I have
> specified a valid unc path for the file.
> When the schedule is run ny SQL Agent, it returns the following error:
> "Failure writing file Depth Distribution : The Report Server has encountered
> a configuration error; more details in the log files"
> When I look at the log file, there is no error indicated. It just says the
> following:
> w3wp!subscription!1ec4!05/04/2005-15:26:58:: Subscription Created for report
> /TRACKERS Templates/Dist Tracker Template/Depth Distribution at
> 2005-05-04T15:26:58 by NSchneider
> w3wp!library!1358!05/04/2005-15:26:59:: i INFO: Call to
> GetPermissions:/TRACKERS Templates/Dist Tracker Template/Depth Distribution
> w3wp!library!1358!05/04/2005-15:26:59:: i INFO: Call to GetSystemPermissions
> [the same GetPermissions / GetSystemPermissions INFO entries repeat through 15:27:11, exactly as shown above]
> Does anyone know what is going on here?
> Thanking you in advance
>

Cannot copy the mdf file after processing

I have a windows service that connects to a regular SQL Server 2005 database and basically bulk copies data into a SQL Express database. Afterwards, I just want to copy the MDF file into a different directory. I keep getting the "Process cannot access file because it is being used by another process" error. I've tried changing the connection strings but nothing seems to work. I'm closing the connections in the code as well. Here is the code; any thoughts or help would be appreciated. RefreshDB calls SyncTable three times and, if all three calls are successful, attempts to copy the database file to the specified location. This is where I get the error.

Public Function RefreshDB() As Boolean
    Dim success As Boolean = True
    Dim tables() As String = {"Job", "Equipment", "PMScheduled"}
    For Each tableName As String In tables
        If SyncTable(tableName) = False Then
            success = False
            Exit For
        End If
    Next
    If success Then
        'copy the new database to the target directory
        Try
            File.Copy(My.Settings.DBFilePath, My.Settings.TargetDirectory + "\EMField.mdf")
        Catch ex As Exception
            My.Application.Log.WriteEntry(ex.Message + "(RefreshDB)", TraceEventType.Critical)
            Return False
        End Try
    End If
    Return success
End Function

Private Function SyncTable(ByVal tableName As String) As Boolean
    Dim reader As SqlDataReader
    Dim sourceViewName As String = String.Format("EMField{0}View", tableName)
    Dim connectionString As String = My.Settings.EMFieldConnectionString
    'insert the path to the database into the connectionstring
    connectionString = connectionString.Replace("[DBFilePath]", My.Settings.DBFilePath)
    Try
        'clear the target table first
        Using targetConnection As New SqlConnection(connectionString)
            Using truncateCommand As New SqlCommand(String.Format("TRUNCATE TABLE {0}", tableName), targetConnection)
                targetConnection.Open()
                truncateCommand.ExecuteNonQuery()
                targetConnection.Close()
            End Using
        End Using
        'create a datareader from the source database
        Using sourceConnection As New SqlConnection(My.Settings.EMLiteConnectionString)
            Using readCommand As New SqlCommand(String.Format("SELECT * FROM {0}", sourceViewName), sourceConnection)
                sourceConnection.Open()
                reader = readCommand.ExecuteReader
                BulkCopy(tableName, reader, connectionString)
                reader.Close()
                sourceConnection.Close()
            End Using
        End Using
        Return True
    Catch ex As SqlException
        My.Application.Log.WriteEntry(ex.Message + "(sync)", TraceEventType.Critical)
        Return False
    End Try
End Function

Public Sub BulkCopy(ByVal tableName As String, ByRef reader As SqlDataReader, ByVal connectionString As String)
    Try
        'bulk copy from source to target database
        Using bulkCopy As New SqlBulkCopy(connectionString)
            bulkCopy.DestinationTableName = tableName
            bulkCopy.WriteToServer(reader)
            bulkCopy.Close()
        End Using
    Catch ex As SqlException
        'throw exception back to calling sub
        Throw ex
    End Try
End Sub

hi,

the database is probably still in use and an active connection is still referencing it, so even if the database autoclose property is set (for SQLExpress created databases), the physical file(s) is/are still locked and file system operations are not allowed..

first verify no active connection is still alive.. consider that connection pooling will keep them in the pool for about 1 minute after the connection has been disposed and released..
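
As a minimal sketch (the module and method names below are made up, not part of the original service), the pool can be flushed explicitly right before the copy:

Imports System.Data.SqlClient
Imports System.IO

Module MdfCopySketch
    'Sketch only: flush every pooled connection held by this process so the
    'SQL Express instance can release its lock on the .mdf, then copy the file.
    Public Sub CopyDatabaseFile(ByVal sourcePath As String, ByVal targetPath As String)
        SqlConnection.ClearAllPools()
        File.Copy(sourcePath, targetPath, True)
    End Sub
End Module

SqlConnection.ClearAllPools() closes the idle connections the pool is holding, which, together with the autoclose behaviour mentioned above, is usually enough for the file lock to be released without resorting to a Sleep.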

again, even if the autoclose property is on by default for SQLExpress instances, consider first "detaching" the database via sp_detach_db before performing file system operations...

regards

|||

Thanks for your help. I can say with certainty that no other process is connected to the sql express database because I am testing it on my own machine. That leaves the connection pooling, but I put pooling=false in the connection string. I have never "detached" a sql express database before. So where exactly (ie what database) do I run this sp_detach_db, and are there any consequences in doing so? In other words, when the service runs again, will I need to reattach the database before connecting to it?

Thanks,

Bob

|||

hi Bob,

Coach24 wrote:

Thanks for your help. I can say with certainty that no other process is connected to the sql express database because I am testing it on my own machine. That leaves the connection pooling, but I put pooling=false in the connection string. I have never "detached" a sql express database before. So where exactly (ie what database) do I run this sp_detach_db

sp_detach_db "unregister" the database from it's related SQL Server instance so that file system operations are allowed on the database's physical files..

and are there any consequences in doing so? In other words when the service runs again, will I need to reattach the database before connecting to it?

Thanks,

Bob

no consequences but, obviously, any connection to the unregistered database will fail until you "reattach" it via the

USE [master]
GO
CREATE DATABASE [test] ON
( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.3\MSSQL\Data\test.mdf' ),
( FILENAME = N'C:\Program Files\Microsoft SQL Server\MSSQL.3\MSSQL\Data\test_log.LDF' )
FOR ATTACH;
GO

statement... so, yes, you have to reattach the database before referencing it..
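
For completeness, a sketch of the detach side, using the same example database name:

USE [master]
GO
-- close any remaining connections, then unregister the database
ALTER DATABASE [test] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
GO
EXEC sp_detach_db @dbname = N'test';
GO

Once the file system work is done, the CREATE DATABASE ... FOR ATTACH statement above re-registers the database.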

regards

|||

While I'm certain your suggestion works, I ended up not needing it. As it turns out, even though I had pooling = false in the connection string, and I closed the connection in the code, the connection was still open when I attempted the File.Copy statement. I put a System.Threading.Thread.Sleep(5000) right before the File.Copy statement and that fixed the problem. Thanks again for your help.

Bob

Wednesday, March 7, 2012

cannot convert between unicode and non-unicode with SQL Server

I keep getting the error message below when attempting to import a text file (flat file) to an SQL Server Destination using SSIS. This database has recently been migrated from SQL Server 2000 (where I used a DTS package that worked fine). I plan on doing further manipulation to the file after importing but have so far tried to keep to one step at a time. I have tried using smaller files and different tables but still get a similar response. I believe that the encoding of the text file is ANSI. If it is relevant, the database collation is Latin1_General_CI_AS (the same as it was on 2000). Any help anyone can provide will be greatly appreciated!!!

TITLE: Package Validation Error

Package Validation Error


ADDITIONAL INFORMATION:

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 0" and "AccountNumber" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 1" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 2" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 3" and "Name" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 4" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 5" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 6" and "ExpiryDate" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 7" and "RateType" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 8" can't be inserted because the conversion between types DT_STR and DT_BOOL is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 9" and "FullName" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 10" and "Address" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 11" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 12" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 13" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 14" and "Occupancy" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 15" and "LoanPurpose" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 16" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 17" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 18" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 19" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 20" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 21" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 22" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 23" and "DocumentLocation" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 24" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 25" and "SecurityType" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 26" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 27" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 28" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 29" and "MortgageInsurancePolicyNumber" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 30" and "SecurityAddress" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 31" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 32" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 33" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 34" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 35" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 36" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 37" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 38" and "SecuritySuburb" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 39" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 40" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 41" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 42" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 43" and "MortgageNumber" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: Columns "Column 44" and "TitleParticulars" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 45" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 46" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 47" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 48" can't be inserted because the conversion between types DT_STR and DT_I4 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 49" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 50" can't be inserted because the conversion between types DT_STR and DT_R8 is not supported.

Error at Data Flow Task [SQL Server Destination [174]]: The column "Column 51" can't be inserted because the conversion between types DT_STR and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [DTS.Pipeline]: "component "SQL Server Destination" (174)" failed validation and returned validation status "VS_ISBROKEN".

Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

Error at Data Flow Task: There were errors during task validation.

(Microsoft.DataTransformationServices.VsIntegration)


BUTTONS:

OK

The errors are exactly as stated. You are trying to do implicit conversions that do not work, e.g. in Column 1 the value is coming in as a string and you are trying to insert it into an integer column.

You will have to open up the package and use the Data Conversion component or the Derived Column component to do explicit conversions.

-Jamie

|||

Thanks for the reply Jamie. I have tried using both of these and still get a similar conversion error (shown below, on a similar table). I have tried converting to different datatypes but still get the same errors.

The same database had no problem using DTS on 2000.

I must be missing something that is probably blatantly obvious, but do you have any other ideas?

TITLE: Package Validation Error

Package Validation Error


ADDITIONAL INFORMATION:

Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionKey" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

Error at Data Flow Task [SQL Server Destination [4395]]: The column "AccountNumber" can't be inserted because the conversion between types DT_NTEXT and DT_WSTR is not supported.

Error at Data Flow Task [SQL Server Destination [4395]]: Column "TransactionCode" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [SQL Server Destination [4395]]: The column "Acmount" can't be inserted because the conversion between types DT_R4 and DT_CY is not supported.

Error at Data Flow Task [SQL Server Destination [4395]]: The column "TransactionDate" can't be inserted because the conversion between types DT_DATE and DT_DBTIMESTAMP is not supported.

Error at Data Flow Task [SQL Server Destination [4395]]: Column "Narration" cannot convert between unicode and non-unicode string data types.

Error at Data Flow Task [DTS.Pipeline]: "component "SQL Server Destination" (4395)" failed validation and returned validation status "VS_ISBROKEN".

Error at Data Flow Task [DTS.Pipeline]: One or more component failed validation.

Error at Data Flow Task: There were errors during task validation.

(Microsoft.DataTransformationServices.VsIntegration)


BUTTONS:

OK

|||

In SQL Server 2000, DTS used to do implicit conversions in many circumstances. The decision was made to eliminate that risky guessing in Integration Services, and to compel all data type conversions, truncations, etc. to be made explicitly.

Note that if you use the Import and Export Wizard to create your "base" package, the wizard creates all these tedious conversion transformations for you.

-Doug

|||

I'm getting the '...can't convert unicode...' conversion errors for strings imported from a table in Access 97.

I see that the Import Wizard does its job and will successfully import the data into a new table.

But if you then change the destination to a table that already exists (created by a db copy from SQL2k), the wizard-generated package fails with the unicode conversion error again.

I've changed the destination column properties to match those created in the new table by the wizard - and it still fails.

I suspect that I could get the package to copy the import into my table by doing an INSERT and then dropping the table that the wizard creates. But I'm sure there's a more direct way than this.

I suspect I'm overlooking something obvious - but it is proving difficult to manage data between Microsoft family applications.

Any help / links appreciated.

|||

I have the same error trying to import an Access database with a memo field type. SQL Server 2005 SSIS can't convert the memo to a varchar and gives the unicode error. Odd.|||

I have found the same problem connecting to a Lotus Notes database via the NotesSQL 3.02g ODBC driver.

Consider the following article on unicode.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/acdata/ac_8_con_03_6voh.asp

When I changed the datatypes of my destination table from varchar to nvarchar, I no longer received the "cannot convert between unicode and non-unicode" error.
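
That change is just an ALTER of the destination columns, for example (the table and column names here are only illustrative):

ALTER TABLE dbo.ImportTarget ALTER COLUMN AccountNumber nvarchar(20) NULL;
ALTER TABLE dbo.ImportTarget ALTER COLUMN FullName nvarchar(100) NULL;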

|||To be clear: String conversion errors can be resolved by changing char columns to nchar and varchar columns to nvarchar in SQL2005.|||

To be really clear, conversion errors happen because we use types inconsistently. There are unfortunately some instances when the type is forced upon us, but it is easy enough to change types inside SSIS. Changing existing database structures to simplify your ETL is not a great justification, albeit if possible it may make life easier. So in summary, changing SQL Server column types will work, but so will converting the data within the pipeline; see the Data Conversion transform or the Derived Column transform.

|||

Well, I guess I'm just another in a long line of folks hitting this trip wire. I'm trying to import an Excel file into an existing SQL table which has varchar columns, not nvarchar. And, you're right, I'm not changing the table definition!

OK, so please forgive the basic nature of the question, but you say it is easy enough to change types in SSIS. So you have to drag one of those transforms (Data Conversion or Derived Column) onto your package, then?

I was just thinking that there ought to be a more native way to do this. But I guess this is it, then?

For example, I right-clicked my source data file and went to the Advanced Editor. There's an Input and Output Properties tab that shows the Excel output in a tree view. When expanded, in turn, there are External and Output columns nodes.

I tried changing these to my desired data types, but it didn't seem to a) save my changes, much less b) make the package work. So what does one use the Input and Output Properties for?

Thanks for any information (or if you're up to adding an in-depth article about it on your SQLIS.com website!). BOL does not seem to offer any meaningful information here.

|||

External columns are what the source actually contains. Output columns are what SSIS gives you downstream. You cannot change the types of either because the external types are decided by the underlying Excel provider, not you, and you cannot change the output columns because they should match the external columns. External vs output is a bit like choosing which columns to bring through to SSIS out of those available. So changing types as part of your column choice does not make sense; it would only lead to problems and failures.

The lack of implicit conversion is annoying for most people, but you can work around it. If implicit conversions take place and they are wrong, then you are generally stuck. The strong typing of SSIS is better in the long run, I firmly believe, although I do often think it would be nice the other way too. Think of the way strings can be interpreted as dates. How many times have you had problems with MD vs DM (if you're in the UK, all the time!)? VBScript was my favourite, as that would convert dates both ways depending on what the value was, all within the same function! You had no control and no idea which way it would go. Hence my assertion that not having implicit conversions is better. On the other hand, some more help could be given in the product, such as selectively allowing similar types to be converted, STR to WSTR and I2 to I4. There will still be issues, no doubt, for more international organisations with multiple extended character sets or those using different code pages.

Data Conversion or Derived Column is the way to go I'm afraid.

|||Again thanks for the update.

So, I'm trying a simple test at this point. Starting with a brand new package, I set up an Excel source, then dragged a Derived Column transformation onto the Data Flow designer. I selected just two columns to test with by dragging them from the Columns pane down to the Expressions field, putting a TRIM() around 'em, entering a custom name for each in the Derived Column Name field, selecting <add as new column>, and selecting string [DT_STR] in the Data Type field.

Then set up my SQL table as the destination and mapped the two derived columns to the corresponding columns in my table.

I'm still getting task validation errors.

"cannot convert between unicode and non-unicode string data types."

So, this makes no sense to me. In the derived transformation editor, I have the data types set to string; in my SQL table, the columns are varchars. I'm just not sure what else I should be doing...

Any suggestions would be welcome.|||

Although you have selected the type of the output column to be DT_STR, you haven't told SSIS how to convert the value into a DT_STR, so in essence all you have done is define another implicit conversion.

Explicit conversions are done using type casts. The following expression will convert a column of type DT_WSTR into a DT_STR:

(DT_STR, <length>, 1252) (<column_name>)
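
For instance, assuming a source column named Column 0 feeding a varchar(50) destination, and a Column 1 that needs to land in an int column (the names are just placeholders), the expressions would be:

(DT_STR, 50, 1252)[Column 0]

(DT_I4)[Column 1]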

Hope that helps.

-Jamie

|||

I am experiencing a similar error using the Microsoft OLEDB provider for oracle as a source. All data types are defined as nvarchar2 in Oracle and exist in a sql server 2000 db typed as nvarchar. Since these are both unicode types, why am I getting the same error?

I did find that the .net oracle provider fixed the unicode/non-unicode error; however, the size of the source field is 1.5 times larger in the destination. The package will run but I get truncation warnings throughout execution.

Any feedback on these is greatly appreciated!

Thanks,

Barbara

|||

I am having the same problem with SQL Server as a source and destination. I am using a "DataReader Source" and specifying a SQL query. I then pass it through Merge Join and Conditional Split transforms. My destination is an OLE DB Destination. If I pick an existing destination table that has EXACTLY the same schema as the source table, the data flow fails with the same message everyone is getting above. Somewhere in the process the package is converting the source data from varchar to nvarchar. I can only get it to work if I have it create a new table in the OLE DB Destination, in which case it gives me nvarchar columns (identical schema as source except it changes the varchars to nvarchars).

Where in the package is it converting from varchar to nvarchar? Is this a known issue? I can tell you with certainty that the source fields are not unicode.
