
Friday, March 30, 2012

Query help

This group is always awesome. Thanks in advance for your help. I need to
put together some recordsets for a charting application. I think this is
everything (DDL, data, expected result set). Thanks again.
CREATE TABLE [dbo].[Tracking] (
[key_] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[value_] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL )
insert into tracking VALUES ('LOCATION','Bedroom')
insert into tracking VALUES ('LOCATION','Dining Room')
insert into tracking VALUES ('LOCATION','Bathroom')
insert into tracking VALUES ('LOCATION','courtyard')
insert into tracking VALUES ('EVENT','verbal aggression')
insert into tracking VALUES ('EVENT','peer')
insert into tracking VALUES ('EVENT','bad behavior')
insert into tracking VALUES ('EVENT','other')
CREATE TABLE [dbo].[Tracking_DATA] (
[ID_] [int] IDENTITY (1, 1) NOT NULL ,
[bts_ID_] [int] NULL ,
[key_] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[value_] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
insert into tracking_DATA VALUES (1, 'LOCATION', 'Bedroom')
insert into tracking_DATA VALUES (1, 'EVENT', 'other')
insert into tracking_DATA VALUES (1, 'EVENT', 'bad behavior')
insert into tracking_DATA VALUES (2, 'LOCATION', 'Bedroom')
insert into tracking_DATA VALUES (2, 'EVENT', 'other')
insert into tracking_DATA VALUES (2, 'EVENT', 'verbal aggression')
insert into tracking_DATA VALUES (3, 'LOCATION', 'courtyard')
insert into tracking_DATA VALUES (3, 'EVENT', 'other')
insert into tracking_DATA VALUES (3, 'EVENT', 'peer')
Ideally, the result set of the query would be
Location Event count (headings if possible)
Bedroom verbal aggression 1
Bedroom peer 0
Bedroom bad behavior 0
Bedroom other 2
Dining Room verbal aggression 0
Dining Room peer 0
Dining Room bad behavior 0
Dining Room other 0
Bathroom verbal aggression 0
Bathroom peer 0
Bathroom bad behavior 0
Bathroom other 0
courtyard verbal aggression 0
courtyard peer 1
courtyard bad behavior 0
courtyard other 1
Also, if possible, another query would return this result set. (I think I
know how to do this one.)
Location Event count
Bedroom verbal aggression 1
Bedroom other 2
courtyard peer 1
courtyard other 1

Sorry, I'm stuck in your DDL; could you explain briefly what these tables are
for?
I can't see the connection between the expected result and the source data.
Jens Suessmeyer.
"Jack" <jack@.jack.net> schrieb im Newsbeitrag
news:ACqee.514$2J6.319@.lakeread06...
> This group is always awesome. Thanks in advance for your help. I need to
> put together some recordsets for a charting application. I think this is
> everything (ddl, data, expected result set.) Thanks again.
> CREATE TABLE [dbo].[Tracking] (
> [key_] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [value_] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL )
> insert into tracking VALUES ('LOCATION','Bedroom')
> insert into tracking VALUES ('LOCATION','Dining Room')
> insert into tracking VALUES ('LOCATION','Bathroom')
> insert into tracking VALUES ('LOCATION','courtyard')
> insert into tracking VALUES ('EVENT','verbal aggression')
> insert into tracking VALUES ('EVENT','peer')
> insert into tracking VALUES ('EVENT','bad behavior')
> insert into tracking VALUES ('EVENT','other')
>
> CREATE TABLE [dbo].[Tracking_DATA] (
> [ID_] [int] IDENTITY (1, 1) NOT NULL ,
> [bts_ID_] [int] NULL ,
> [key_] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [value_] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
> ) ON [PRIMARY]
> GO
> insert into tracking_DATA VALUES (1, 'LOCATION', 'Bedroom')
> insert into tracking_DATA VALUES (1, 'EVENT', 'other')
> insert into tracking_DATA VALUES (1, 'EVENT', 'bad behavior')
> insert into tracking_DATA VALUES (2, 'LOCATION', 'Bedroom')
> insert into tracking_DATA VALUES (2, 'EVENT', 'other')
> insert into tracking_DATA VALUES (2, 'EVENT', 'verbal aggression')
> insert into tracking_DATA VALUES (3, 'LOCATION', 'courtyard')
> insert into tracking_DATA VALUES (3, 'EVENT', 'other')
> insert into tracking_DATA VALUES (3, 'EVENT', 'peer')
> Ideally, the result set of the query would be
> Location Event count (headings if possible)
> Bedroom verbal aggression 1
> Bedroom peer 0
> Bedroom bad behavior 0
> Bedroom other 2
> Dining Room verbal aggression 0
> Dining Room peer 0
> Dining Room bad behavior 0
> Dining Room other 0
> Bathroom verbal aggression 0
> Bathroom peer 0
> Bathroom bad behavior 0
> Bathroom other 0
> courtyard verbal aggression 0
> courtyard peer 1
> courtyard bad behavior 0
> courtyard other 1
> Also, if possible, another query would return this result set. (I think I
> know how to do this one.)
> Location Event count
> Bedroom verbal aggression 1
> Bedroom other 2
> courtyard peer 1
> courtyard other 1
>

Thanks for helping me.
Basically, a script uses the 'tracking' table to build a web page. Each
key_ value is used to create a separate section of the form. The value_ is
used to create a checkbox that the user can select. The 'tracking_Data'
table contains the values that the user selected. I am using all of this
data to build a result set that will be used for a graphing tool.
"Jens Smeyer" <Jens@.Remove_this_For_Contacting.sqlserver2005.de> wrote in
message news:%23vt8ObYUFHA.2096@.TK2MSFTNGP14.phx.gbl...
> Sorry, im stuck in your DDL, could you explain briefly what these tables
> are for ?
> I cant see the connection between the estimated result and the source
> data.
> Jens SUessmeyer.
> "Jack" <jack@.jack.net> schrieb im Newsbeitrag
> news:ACqee.514$2J6.319@.lakeread06...

This is a seriously heinous data structure, which is why querying for the data
you want is so difficult.
Here's a solution:
Select Locations.LocationValue, Events.EventValue
     , (
         Select Count(*)
         From (
                Select LocationData.LocationValue, EventData.EventValue
                From (
                       Select TD1.Bts_id, TD1.[value] As LocationValue
                       From dbo.Tracking_Data As TD1
                       Where TD1.[key] = 'LOCATION'
                     ) As LocationData
                Join (
                       Select TD2.Bts_id, TD2.[value] As EventValue
                       From dbo.Tracking_Data As TD2
                       Where TD2.[key] = 'EVENT'
                     ) As EventData
                  On LocationData.Bts_Id = EventData.Bts_Id
              ) As CollatedEventData
         Where CollatedEventData.LocationValue = Locations.LocationValue
           And CollatedEventData.EventValue = Events.EventValue
       )
From (
       Select T1.[value] As LocationValue
       From dbo.Tracking As T1
       Where T1.[key] = 'LOCATION'
     ) As Locations
   , (
       Select T2.[value] As EventValue
       From dbo.Tracking As T2
       Where T2.[key] = 'EVENT'
     ) As Events
Order By Locations.LocationValue, Events.EventValue
Let me make a few suggestions:
1. The reason that I had to use so many subqueries is that those entities are
all lopped into the same table. There should be separate tables for Locations
and Events.
2. The column names are seriously painful. Beyond the fact that I personally
hate underscores in column names, using underscores at the end of the column
name is really non-intuitive. I removed them for my example and came across the
next column name faux pas. Don't use "key" and "value" for column names. It
means that the developer *has to* surround the column name with square brackets
for everything, which is a serious pain.
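An alternative shape for the same counts, sketched against the original column names (key_, value_, bts_ID_) and not tested against the posted data, is a CROSS JOIN of the two lookup sets with a LEFT JOIN and a GROUP BY:

-- Sketch only: one row per Location/Event pair, zero counts included
SELECT loc.value_ AS Location,
       ev.value_  AS Event,
       COUNT(d2.bts_ID_) AS [count]
FROM dbo.Tracking AS loc
CROSS JOIN dbo.Tracking AS ev
LEFT JOIN dbo.Tracking_DATA AS d1
       ON d1.key_ = 'LOCATION' AND d1.value_ = loc.value_
LEFT JOIN dbo.Tracking_DATA AS d2
       ON d2.key_ = 'EVENT' AND d2.value_ = ev.value_ AND d2.bts_ID_ = d1.bts_ID_
WHERE loc.key_ = 'LOCATION'
  AND ev.key_ = 'EVENT'
GROUP BY loc.value_, ev.value_
ORDER BY loc.value_, ev.value_

It is just as much a symptom of the underlying problem, though: lookups and facts for every entity are squeezed into the same two tables.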
HTH
Thomas

Whew, hats off for your query. I sure don't know if that works, but reading
the structure of the tables also gave me the creeps.
@OriginalPoster: Perhaps you have the possibility to redesign your database
structure. As time goes by you will run into more than one issue with this
DB structure. So if you have the chance and the power to make some design
changes, this would make life easier for you in the future. (...just my two
pence...)
Jens Suessmeyer.
"Thomas Coleman" <replyingroup@.anywhere.com> schrieb im Newsbeitrag
news:e5qhKGZUFHA.2520@.TK2MSFTNGP09.phx.gbl...
> This is a serioius heinous data structure which is why querying for the
> data you want is so difficult.
> Here's a solution:
> Select Locations.LocationValue, Events.EventValue
> , (
> Select Count(*)
> From (
> Select LocationData.LocationValue, EventData.EventValue
> From (
> Select TD1.Bts_id, TD1.[value] As LocationValue
> From dbo.Tracking_Data As TD1
> Where TD1.[key] = 'LOCATION'
> ) As LocationData
> Join (
> Select TD2.Bts_id, TD2.[value] As EventValue
> From dbo.Tracking_Data As TD2
> Where TD2.[key] = 'EVENT'
> ) As EventData
> On LocationData.Bts_Id = EventData.Bts_Id
> ) As CollatedEventData
> Where CollatedEventData.LocationValue = Locations.LocationValue
> And CollatedEventData.EventValue = Events.EventValue
> )
> From (
> Select T1.[value] As LocationValue
> From dbo.Tracking As T1
> Where T1.[key] = 'LOCATION'
> ) As Locations
> , (
> Select T2.[value] As EventValue
> From dbo.Tracking As T2
> Where T2.[key] = 'EVENT'
> ) As Events
> Order By Locations.LocationValue, Events.EventValue
> Let me make a few suggestions:
> 1. The reason that I had to use so many subqueries is that those entities
> are all lopped into the same table. There should be separate tables for
> Locations and Events.
> 2. The column names are seriously painful. Beyond the fact that I
> personally hate underscores in column names, using underscores at the end
> of the column name is really non-intuitive. I removed them for my example
> and came across the next column name faux pas. Don't use "key" and "value"
> for column names. It means that the developer *has* surround the column
> name with square brackets for everything which is a serious pain.
>
> HTH
>
> Thomas

Thanks for looking at this.
You wrote:
> The reason that I had to use so many subqueries is that those entities are
> all lopped into the same table. There should be separate tables for
> Locations and Events.
I agree with you, but the challenge I am having is I have to make this
application generic. The 'Locations' and 'Events' tables may be called
something different for another customer. Plus, we will not know how many
different items will get tracked. In my example, there are only two,
but in the real world, there could be thirty. That is why I have everything
in one table keyed by 'key_' (which I agree with you that the name should
change.)
I definitely need help with data structure stuff, so your suggestions are
welcome and appreciated to make this work.
"Thomas Coleman" <replyingroup@.anywhere.com> wrote in message
news:e5qhKGZUFHA.2520@.TK2MSFTNGP09.phx.gbl...
> This is a serioius heinous data structure which is why querying for the
> data you want is so difficult.
> Here's a solution:
> Select Locations.LocationValue, Events.EventValue
> , (
> Select Count(*)
> From (
> Select LocationData.LocationValue, EventData.EventValue
> From (
> Select TD1.Bts_id, TD1.[value] As LocationValue
> From dbo.Tracking_Data As TD1
> Where TD1.[key] = 'LOCATION'
> ) As LocationData
> Join (
> Select TD2.Bts_id, TD2.[value] As EventValue
> From dbo.Tracking_Data As TD2
> Where TD2.[key] = 'EVENT'
> ) As EventData
> On LocationData.Bts_Id = EventData.Bts_Id
> ) As CollatedEventData
> Where CollatedEventData.LocationValue = Locations.LocationValue
> And CollatedEventData.EventValue = Events.EventValue
> )
> From (
> Select T1.[value] As LocationValue
> From dbo.Tracking As T1
> Where T1.[key] = 'LOCATION'
> ) As Locations
> , (
> Select T2.[value] As EventValue
> From dbo.Tracking As T2
> Where T2.[key] = 'EVENT'
> ) As Events
> Order By Locations.LocationValue, Events.EventValue
> Let me make a few suggestions:
> 1. The reason that I had to use so many subqueries is that those entities
> are all lopped into the same table. There should be separate tables for
> Locations and Events.
> 2. The column names are seriously painful. Beyond the fact that I
> personally hate underscores in column names, using underscores at the end
> of the column name is really non-intuitive. I removed them for my example
> and came across the next column name faux pas. Don't use "key" and "value"
> for column names. It means that the developer *has* surround the column
> name with square brackets for everything which is a serious pain.
>
> HTH
>
> Thomas

There is such a thing as "too" generic. There has to be some structure or
everything becomes nothing more than a couple of tables called "things". The
real key (no pun intended) is commonality. Is there a pattern to the data that
they want to store? It may not be possible to create one structure to rule them
all and in the darkness bind them.
Thomas
"Jack" <jack@.jack.net> wrote in message news:DLsee.522$2J6.105@.lakeread06...
> Thanks for looking at this.
> You wrote :
> all lopped into the same table. There should be separate tables for Locati
ons
> I agree with you, but the challenge I am having is I have to make this
> application generic. The 'Locations' and 'Events' table may be called
> something different for another customer. Plus, we will not know how man
y
> different items that will get tracked. In my example, there are only two
,
> but in the real world, there could be thirty. That is why I have everythi
ng
> in one table keyed by 'key_' (which I agree with you that the name should
> change.)
> I definitely need help with data structure stuff, so your suggestions are
> welcome and appreciated to make this work.
> "Thomas Coleman" <replyingroup@.anywhere.com> wrote in message
> news:e5qhKGZUFHA.2520@.TK2MSFTNGP09.phx.gbl...

>> the challenge I am having is I have to make this application generic. <<

ARRRRGHH!! There is no such thing in RDBMS. You have missed all the
foundations. The idea is that you have a known data model of an
external reality that you manipulate. Users do not get to make
elephants fall out of the sky.

>> The 'Locations' and 'Events' tables may be called something different for another customer. <<

Then you would name them appropriately, not generically!!

>> we will not know how many different items will get tracked. <<

Then you have not done your job as a data modeler, have you?

>> That is why I have everything in one table keyed by 'key_' (which I agree with you that the name should change.) <<

"To be is to be something in particular; to be nothing in particular is
to be nothing." --Aristotle
The name of this design flaw is EAV (entity-attribute-value). It falls
apart in a year of production work in several ways:
1) The queries look like the nightmare that Mr. Coleman posted, only
*orders of magnitude* -- did you hear that? repeat: *orders of
magnitude* -- more complex and slower. The overhead of type conversion
alone will kill a real application. Or did you write an entire
temporal and arithmetic library for VARCHAR(255) strings that does not
use CAST()?
2) All data integrity is destroyed. Any typo becomes a new attribute
or entity. Entities are found missing attributes, so all the reports
are wrong.
3) Try to write a single CHECK() constraint that works for all the
attributes of those 30+ entities your users created because you were
too dumb or too lazy to do your job. It can be done! You need a case
expression with almost 70 WHEN clauses for a simple invoice and order system
when I tried it as an exercise.
4) Try to write a single DEFAULT clause for 30+ entities crammed into
one column. Impossible!
5) Try to set up DRI actions among the entities. If you thought the
WHEN clauses in the single CASE expression were unmaintainable, wait
until you see the "TRIGGERs from Hell" -- Too bad that they might not
fit into older SQL Server which had some size limits. Now maintain it.
I bill $1000-$2000 per day to look at and fix this exact design
problem. Let me know if your boss wants to pay me, fix the problem
in-house, or go out of business. It is that kind of mistake.

Monday, March 26, 2012

Query Governor

Hi,
Can I define a limit on CPU/IO consumption for a specific database user?
We have an application whose database resource consumption should be
restricted.
I can't use the query governor because it puts a global restriction on all users
(am I correct?)
Thanks,
Allan

No, sorry.
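For reference, the server-wide setting being alluded to is the query governor cost limit, and it really does apply to every connection; a sketch of how it is set (300 is just an example cost value):

-- Server-wide only; there is no per-user or per-database version of this setting
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query governor cost limit', 300
RECONFIGURE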
"A.M" <IHateSpam@.sapm123.com> wrote in message
news:OczCnokIEHA.3664@.TK2MSFTNGP11.phx.gbl...
> Hi,
> Can I define limitation on consuming CPU/IO for specific database user ?
> We have an application that it's database resource consumption should be
> restricted.
> I can't use query governor because it puts global restriction for all
users
> (am i correct?)
> Thanks,
> Allan
>


Friday, March 23, 2012

Query for finding trailing spaces in a column

It seems like our application is inserting trailing spaces
into the varchar field (spaces before and after the
string). Can anyone help me with a query to find out which
rows have trailing spaces in a column?
Thanks for any help......
SET NOCOUNT ON
CREATE TABLE #splunge
(
foo VARCHAR(10)
)
GO
INSERT #splunge SELECT 'val1'
INSERT #splunge SELECT 'val2 '
INSERT #splunge SELECT ' val3'
INSERT #splunge SELECT ' val4 '
GO
-- leading spaces:
SELECT * FROM #splunge WHERE LTRIM(foo) != foo
-- trailing spaces:
SELECT * FROM #splunge WHERE DATALENGTH(RTRIM(foo))!=DATALENGTH(foo)
GO
DROP TABLE #splunge
GO
http://www.aspfaq.com/
(Reverse address to reply.)
"John" <anonymous@.discussions.microsoft.com> wrote in message
news:2796201c4642a$94e8df30$a601280a@.phx.gbl...
> It seems like our application inserting trailing spaces
> into the varchar field (spaces before and after the
> string). Can anyone help me with a query to find out which
> rows have trailing spaces in a column ?
> Thanks for any help......
Thanks........

select * from TheTable where theColumn like '% '
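Once the offending rows are found, cleaning them up is straightforward; a sketch using the #splunge example from the first reply:

-- Trim wherever trimming would change the stored length
UPDATE #splunge
SET foo = LTRIM(RTRIM(foo))
WHERE DATALENGTH(foo) <> DATALENGTH(LTRIM(RTRIM(foo)))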
"John" <anonymous@.discussions.microsoft.com> wrote in message
news:2796201c4642a$94e8df30$a601280a@.phx.gbl...
> It seems like our application inserting trailing spaces
> into the varchar field (spaces before and after the
> string). Can anyone help me with a query to find out which
> rows have trailing spaces in a column ?


Wednesday, March 21, 2012

query for date range

I use OLE DB (Access database) in my application. I want to build a query to retrieve the number of Bookings from my Booking table where the time-of-day part of appointment_date_time is in the range 9am-12pm or 2pm-5pm, that is (>= 9 and < 12) or (>= 14 and < 17). Please help me build the query.

I found some query samples like:

select * from tblstudents where classID='1' and studentstartdate between ('2004-12-03') and ('2004-12-12')

or

WHERE DateField BETWEEN @.StartDate AND @.EndDate

But I don't want to search by year, month and day; I just want to search on the actual hour of the day. I am stuck with the syntax, please help.

Did you try the HOUR function? It will return the value of the actual hour, and this can be filtered like any integer column.
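A sketch of what that could look like in Access/Jet SQL, reusing the table and column names from the question:

SELECT COUNT(*) AS NumBookings
FROM Booking
WHERE (HOUR(appointment_date_time) >= 9 AND HOUR(appointment_date_time) < 12)
   OR (HOUR(appointment_date_time) >= 14 AND HOUR(appointment_date_time) < 17)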


HTH, Jens K. Suessmeyer.


http://www.sqlserver2005.de

Query faster in Access than SQL Server

I have a VB application which I recently changed from using an Access database to SQL Server. I upsized the Access database to SQL. The query takes about 1 second when run from Access. It takes about 1 minute to run from SQL Server using Query Analyzer or through my application. I would be grateful for suggestions. The query is:

SELECT *
FROM (((((((WorkLog AS W LEFT JOIN Grp AS G ON w.wrkgrpsysid = g.GrpSysId)
LEFT JOIN Register AS R ON w.wrkregSysId = r.regSysId)
LEFT JOIN Location AS L ON w.wrklocSysId = l.locSysId)
LEFT JOIN CostCenter AS C ON w.wrkcstSysId = c.cstSysId)
LEFT JOIN SubAssembly AS S ON w.wrksubsysid = s.subsysid)
LEFT JOIN Priority AS Pr ON w.wrkpriority = pr.prisysid)
LEFT JOIN Plann AS P ON w.wrkplnsysid = p.plnsysid)
Left join Route RT on w.wrkrtesysid=rt.rtesysid
Where (WrkType = 'PM' or (WrkType = 'PL') or (WrkType = 'UN'))

What indexes do you have defined for these tables? One way to get a handle on what's going on is to use Query Analyzer to view the execution plan. Paste your query into QA and on the menu select Query -> Show Execution Plan, or hit Ctrl + K. When your query runs you will see what indexes are being used OR if a table scan is being performed.

The indexes defined in these tables are the ones that were upsized from the Access db. They are all primary indexes. There were no indexes for WrkType. I created some and the time did not improve.

Looking at the execution plan, it appears that there are a lot of table scans going on and then the results are joined. I can't see any reference to the indexes here. Should this be happening when I have indexes?

If you have usable indexes, yes.

You could use the Index Tuning Wizard. In QA, have your query ready to go, then select Query -> Index Tuning Wizard. Just follow the steps and select thorough for the tuning mode. This will look at your query, tables, indexes and statistics. If more optimal indexes or statistics are needed you have the option of creating them or scripting them for later execution.

Try this and post back with questions.

Paul,

I did as you said and it reported that no indexes are used in executing the query. It also reported that no changes were needed.

I can't think of anything other than the indexes that would be causing such a dramatic difference in the execution times. Would clustering some of the indexes help?

It seems strange considering that the DB has been upsized with all the indexes etc. I presume that this is a common problem?

I just converted my Access database to SQL Server (converting most queries to stored procedures) and have found that my queries run somewhat slower on my standalone system, but apparently execute much faster than Access did originally in our company's production system.

Run UPDATE STATISTICS on your tables. I'm guessing your query plan was originally compiled when there was little or no data in your tables, hence the optimizer chose not to use the indexes.

blindman

I can't imagine Access handling anything better than SQL Server. I'd also say make sure your statistics are updated, and are set to automatically update. Also, make sure you have indexed fields in the most efficient ways. AND make sure your server is using all the processors it has available, etc. If there are table scans going on, then I would think that the indexes aren't set up properly.

I've tried to upsize DBs from Access before, and had nothing but problems. If I were you, I'd build the DB like I wanted it in SQL Server, then export the Access data to a text file or something and import it into SQL Server. That's proven the best way for me.

Thanks everyone for your help so far.

I've tried to see where the delay is coming from. When I run the following query:

select WrkSysid
from worklog
where (wrktype = 'PM')

it does an index seek on the field 'wrktype' and an index scan on the field 'Wrksysid'. But when I run the following query:

select WrkSysid, WrkGrpSysid
from worklog
where (wrktype = 'PM')

it does a table scan. This is despite having indexes on all 3 fields. I've run UPDATE STATISTICS on all the tables. Is this normal?

I came across such an amount of problems that I wrote and used my own conversion program.

Your query has a lot of joins; do you have a referential constraint between WorkLog.wrkgrpsysid and Grp.GrpSysID, for example? If not, make sure that you have an index (simply with duplicates) on wrkgrpsysid and an index (preferably UNIQUE) on GrpSysID.

Is the index a clustered index? Or do you have a separate index defined for each field? It looks like you have separate indexes on each field. If so, try using a clustered index containing both of those fields.

Originally posted by AnSQLQuery:
Is the index a clustered index? Or do you have a separate index defined for each field? It looks like you have separate indexes on each field. If so, try using a clustered index containing both of those fields.

There is plenty of noise in your remark, Query.

1) I guess by "clustered index" you mean one index on several fields, instead of two indices?! Such an index does not have a special name.
2) You can't create an index on fields in two tables!
3) With a clustered index, you store the whole table according to the sort order of the index. The question, however, is to join related data to a WorkLog table, which shall be scanned fully without any ordering.

Depending on the selectivity, you may increase the speed by defining an index on the criteria field WrkType.

1) I guess by "clustered index" you mean one index on several fields, instead of two indices?! Such an index does not have a special name.

Doc,
This would be called a composite index.

Thanks Rocket, do you also have a name for an index on just one field?

Do you have a composite index on wrktype and wrkSysid? If so, your first query would execute solely on the index and never touch the actual table, making it very fast. Your second query would have to reference the table to pull in the WrkGrpSysid value.
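A sketch of the kind of index being described, using the column names from the queries above (untested):

-- One composite index that covers: SELECT WrkSysid, WrkGrpSysid FROM worklog WHERE wrktype = 'PM'
CREATE INDEX IX_WorkLog_WrkType_Covering
    ON dbo.WorkLog (WrkType, WrkSysid, WrkGrpSysid)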

Also, a lot depends upon the cardinality of your data, which describes how many unique values are in a given column or combination of columns. If the cardinality of wrktype is low (say there are only two values; "AM" and "PM") then the optimizer gains little or nothing from using an index on that column because it does not significantly reduce the number of pages it has to search through. It would thus ignore the index.

blindman

PS: Indexes on single columns are just called indexes.

Wow, next time I won't drink so much before posting...

Monday, March 12, 2012

Query Engine Error

Hi All,

I am trying to build an application in VS.NET which uses Crystal Reports 11 to build the .rpt file. I have an XML file and an XSD file as my data sources. I want to view the report in PDF format. My source code looks like this:

ReportDocument doc = new ReportDocument();
string fileName = Server.MapPath("MyReport.rpt");
doc.Load(fileName);

DataSet ds = new DataSet();
ds.ReadXml(Server.MapPath("MyXML.xml"));

doc.SetDataSource(ds);

ExportOptions exportOpts = doc.ExportOptions;
exportOpts.ExportFormatType = ExportFormatType.PortableDocFormat;
exportOpts.ExportDestinationType = ExportDestinationType.DiskFile;
exportOpts.DestinationOptions = new DiskFileDestinationOptions();

DiskFileDestinationOptions diskOpts = new DiskFileDestinationOptions();
((DiskFileDestinationOptions)doc.ExportOptions.DestinationOptions).DiskFileName = Server.MapPath("MyPDF.pdf");

doc.Export();

But it gives me the error
"Query Engine Error: 'c:\inetpub\wwwroot\MyWebsite\MyReport.rpt'"

I tried all the forums but could not find any solution. Someone told me that it is due to an XSD file error. But in Crystal Reports it runs perfectly OK. Can anyone help?

Did you figure out what happened? I am getting this error during my deployment, but on my local PC it works fine.

bev

query distribution using service broker

Hello ,

I am working on an application wherein I am migrating a legacy application to SQL Server. Whenever I make any changes in the legacy application, I immediately move those changes to SQL, using insert/delete/update against SQL.

I also want all this data on 2 to 3 other SQL Servers which are on different remote machines. For this I thought of using Service Broker to send the queries as messages.

So on the initiator (where the legacy application resides), I create 2 to 3 queues (based upon the number of receivers), and each time a query succeeds on the initiator I put these messages (queries) into the initiator queues and send them to the various targets. On the target I just pop these messages and execute them, so that the query runs on all the targets and all my databases stay in sync.

I need some info on various operations on Service Broker queues, like:

1. How do I insert/update/delete from the queue?
2. Can I set a size constraint on the queue?
3. Can I get info such as when the queue is full?
4. Can I set some custom flags on the queue, e.g. queue is invalid or valid (setting and getting these flag values)?
5. Clearing the entire queue.

These are the operations I need on the queue for various use cases (failure cases).

Can you point to some documentation/tutorials which shed more light on these queue operations?
Thanks,

1. Unlike other messaging products like MSMQ, MQSeries, etc., Service Broker doesn't allow you to enqueue directly to queues. Instead, it provides reliable communication via dialogs between services. Queues are just backing stores used by services to store incoming messages until they are processed. But the targetable endpoints are services, and the primitive for communication is a dialog or conversation.

In your application, you could create one initiator service from where you will send the data, one intermediate service which would receive the data and in turn send copies to each of the target services. The target services would reside in each of your databases on the different SQL Server instances.

2. Service Broker Queues do not have size constraints and therefore there is no such thing as queue is full. Like database tables, they can grow logically up to infinity or physically up to the size of the physical media they reside on.

3. You can enable/disable the queue which causes the service to accept/reject inbound messages.

4. You cannot clear the queue in a single operation. But you could iterate over all the conversation endpoints and end them, causing messages associated with those conversations to be deleted from the queue.

Books Online (the SQL Server documentation) and Roger Wolter's book are good resources to learn about Service Broker. Also look at my links on our homepage.
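A bare-bones sketch of that kind of setup follows; every object name below is made up, and the routing and security needed between instances is left out entirely:

-- On both sides: a message type and a contract (illustrative names)
CREATE MESSAGE TYPE [//Legacy/ChangeStatement] VALIDATION = NONE;
CREATE CONTRACT [//Legacy/ChangeContract]
    ([//Legacy/ChangeStatement] SENT BY INITIATOR);

-- Initiator database
CREATE QUEUE dbo.ChangeSenderQueue;
CREATE SERVICE [//Legacy/ChangeSender] ON QUEUE dbo.ChangeSenderQueue;

-- Each target database
CREATE QUEUE dbo.ChangeTargetQueue;
CREATE SERVICE [//Legacy/ChangeTarget]
    ON QUEUE dbo.ChangeTargetQueue ([//Legacy/ChangeContract]);

-- Sending one change from the initiator
DECLARE @h uniqueidentifier;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE [//Legacy/ChangeSender]
    TO SERVICE '//Legacy/ChangeTarget'
    ON CONTRACT [//Legacy/ChangeContract]
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h
    MESSAGE TYPE [//Legacy/ChangeStatement]
    (N'UPDATE dbo.SomeTable SET Col1 = 1 WHERE Id = 42');

-- On a target: pop one message and execute it
DECLARE @body varbinary(max), @msg nvarchar(max);
RECEIVE TOP (1) @body = message_body FROM dbo.ChangeTargetQueue;
SET @msg = CAST(@body AS nvarchar(max));
IF @msg IS NOT NULL EXEC (@msg);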

Wednesday, March 7, 2012

Query Compilation Plans

Hi all,
We have a third party application that makes extensive use of
sp_executesql which allows it to reuse compilation plans for the sql
queries. Now, I am trying to monitor what the application is doing using the
Profiler. Since the application reuses compilation plans, often I am not
able to see the actual queries in the profile. So I would like to
temporarily force recompilation of all queries executed by the application.
I don't know how to do it. So please help me and tell me how to do this in
Query Analyzer; I don't have access to Enterprise Manager.
Thanks in advance.

> Since the application reuses compilation plans, often I am not able to see
> the actual queries in the profile.

You should be able to see the statements sent to SQL Server, unless the client
app is using sp_prepare / sp_execute, in which case you will see sp_execute
and parameters.
AMB
"helpful sql" wrote:

> Hi all,
> We have a third party application that makes extensive use of
> sp_executesql which allows it to reuse compilation plans for the sql
> queries. Now, I am trying to monitor what the application is doing using t
he
> Profiler. Since the application reuses compilation plans, often I am not
> able to see the actual queries in the profile. So I would like to
> temporarilly force recompilation of all queries executed by the applicatio
n.
> I don't know how to do it. So please help me and tell me how to do this in
> Query Analyzer; I don't have access to Enterprise Manager.
> Thanks in advance.
>

Try

dbcc freeproccache

for removing all the execution plans from the cache.

Yes, the application is using sp_prepare/sp_execute. Is there a way to see
the actual queries used in sp_prepare? I did not capture them in the
Profiler.
Thanks.
"Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in message
news:10C6CECD-CD3B-458F-8171-1337EBC78B0A@.microsoft.com...[vbcol=seagreen]
> the
> You should be able to see statement sent to sql server, unless the client
> app is using sp_prepare / sp_execute, in that case you will see sp
> _execute and parameters.
>
> AMB
> "helpful sql" wrote:

helpful sql,
Try looking inside the system table syscacheobjects.
select objtype, sql
from master.dbo.syscacheobjects
where objtype = 'Prepared'
go
syscacheobjects:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_sys-c_3dyr.asp
AMB
"helpful sql" wrote:

> Yes, the application is using sp_prepare/sp_execute. Is there a way to see
> the actual queries used in sp_prepare? I did not capture them in the
> Profiler.
> Thanks.
> "Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in messag
e
> news:10C6CECD-CD3B-458F-8171-1337EBC78B0A@.microsoft.com...
>

Lookup fn_get_sql() in BOL.
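If the statement text still does not show up, fn_get_sql can pull it for a given connection; a rough sketch (pick the spid you are tracing from sysprocesses):

DECLARE @handle binary(20)
SELECT @handle = sql_handle
FROM master.dbo.sysprocesses
WHERE spid = 53   -- substitute the spid you are watching

SELECT [text]
FROM ::fn_get_sql(@handle)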
"helpful sql" wrote:

> Hi all,
> We have a third party application that makes extensive use of
> sp_executesql which allows it to reuse compilation plans for the sql
> queries. Now, I am trying to monitor what the application is doing using t
he
> Profiler. Since the application reuses compilation plans, often I am not
> able to see the actual queries in the profile. So I would like to
> temporarilly force recompilation of all queries executed by the applicatio
n.
> I don't know how to do it. So please help me and tell me how to do this in
> Query Analyzer; I don't have access to Enterprise Manager.
> Thanks in advance.
>
>


query cannot return any results

I have a very uncommon problem ... I have an application that runs like this:
1) it receives user inserts, including a Status = 'U' field;
2) based on the Status field (indexed), the application queries and selects the last
inserted records all day long, every 30 seconds;
3) every time it reads the records it changes the Status field to 'R'.
The problem is that after about 24 hours, the query no longer identifies
records with that index. It returns nothing. I use an ExecuteReader
command, and it happens that myreader.HasRows = false, even if there are rows
with Status field = 'U'.
Can somebody help me to know what is happening?
--
Sergio R Pires

The query optimiser should use the best available index or column stats to decide
strategy, and this may be cached for long period.
Maybe stats get recomputed automatically if you make lotsa changes and have
updatestats dboption set, or explicitly by your DBA [recommendation used to
be to do explicitly due to excessive overhead but nowadays with autonomics
MSSQL does the right thing].
If you truncate/delete staging table [daily] just before query is compiled
into cache it may decide to use tablescan even if index available [since so
_few_ rows] and this may persist some time even if cardinality builds up a
lot.
Unfortunately the sysindexes.rowcnt cannot be relied on for accuracy [due to
transaction activity], so you may have to force count(*) to get real count
but this has locking issues [nolock would only give approx count like
sysindexes].
Dependencies can be omitted from sysdepends [to support forward compilation]
so optimiser may be similarly ignorant.
I suspect the optimiser is getting , so suggest that
1. check latest Service Pack applied
2. exec sp_dboption 'pubs','auto create statistics','on' -- substitute
dbname for pubs
3. exec sp_dboption 'pubs','auto update statistics','on' -- substitute
dbname for pubs
4. use QA to show query plan [Control-L]
5. try explicit sp_recompile
6. check dependencies
if all else fails you can mark your sproc "WITH RECOMPILE" to ignore the cached
copy, thus keeping abreast of actual cardinality.
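A sketch of items 2, 3 and 5, plus a manual statistics refresh; the database, table and procedure names here are placeholders:

EXEC sp_dboption 'MyDb', 'auto create statistics', 'on'
EXEC sp_dboption 'MyDb', 'auto update statistics', 'on'
GO
USE MyDb
UPDATE STATISTICS dbo.MyStatusTable      -- refresh stats by hand after heavy churn
EXEC sp_recompile 'dbo.MyPollingProc'    -- item 5: throw away the cached plan
GO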
best wishes!
Dick
"Sergio R Pires" wrote:

> I have a very uncommon problem ... I have a application that runs like thi
s:
> 1) it receives users inserts, including a Status = 'U' field;
> 2) based on a Status field (index) the application query and select the la
st
> inserted registries all day long, each 30 seconds;
> 3) every time it read the registries it changes the Status field to 'R'.
> The problem is that after about 24 hours, the query identifies no longer
> registries with that index. It returns nothing. I use a execurereader
> command, and it happens that myreader.hasrows = false, even if there are r
ows
> with Status field = 'U'
> Can somebody help me to know what is happening?
> --
> Sergio R Pires

Saturday, February 25, 2012

Query by database or code

I have a database and would like to retrieve specific data via queries. This database is also connected to an ASP.NET 2.0 application as the front end. I've created the query in the database. Would you recommend I use parameter names to retrieve the data via code, or should I have the query within my code to retrieve the data?

Thanks

I'm not exactly sure what you are asking. Are you asking whether or not to do your database access from the ASPX page or the code-behind? Or, are you asking if you should parameterize your queries?

|||

I'm using ASP.NET to create a website and here's an example of how I retrieve the data:

Dim strCommandText As String = "SELECT * FROM Player INNER JOIN Manufacturer ON Player.PlayerManufacturerID = Manufacturer.ManufacturerID ORDER BY Player.PlayerName"
Dim myCommand As SqlCommand = New SqlCommand(strCommandText, myConnection)

Now, above I write the SQL syntax in my application to retrieve data. If I have a query created within the DATABASE called ParameterName, then I can write the code as:

MyCommand.Parameters.Add(ParameterName)

So my question is really which method is preferred, and why, as they're both doing the same thing?

Thanks

|||

If I still misunderstood your question, then my apologies, but I think what you're asking is: if your Select statement requires parameters, should I inject them directly into my statement, or use a parameter and add the parameters afterwards? If this is what you're truly asking, then I'd definitely suggest the use of parameters as they create a much less error-prone design. Here's an article on the subject:

http://www.4guysfromrolla.com/webtech/092601-1.shtml

|||

The "common" best practice approach would be to have a separate class library that will return the "Players" via a stored procedure. I would also recommend not using SELECT *, you should only select the fields you need.

|||

If I still misunderstood your question, then my apologies, but I think what you're asking is: if your Select statement requires parameters, should I inject them directly into my statement, or use a parameter and add the parameters afterwards? If this is what you're truly asking, then I'd definitely suggest the use of parameters as they create a much less error-prone design. Here's an article on the subject:

http://www.4guysfromrolla.com/webtech/092601-1.shtml

Not quite. I'm just trying to see if I should use a Select query in my code or call a parameterized stored procedure from the database to retrieve the data. Sorry, my fault, I was using the wrong terminology; I should have said stored procedure but got my wires crossed, as ASP.NET terms this as a parameter being passed in to the code you're writing.

jguadagno:

The "common" best practice approach would be to have a separate class library that will return the "Players" via a stored procedure. I would also recommend not using SELECT *, you should only select the fields you need.

I think this may have answered the question. Where can I get started with class libraries? Also, thanks for advising not to use SELECT *; I only used that in this question as an example.

Thanks guys

|||

EssCee:

I think this may have answered the question. Where can I get started with class libraries?

http://www.15seconds.com/issue/050721.htm