Monday, March 19, 2012

Another Temporary Table Q

Hi
I've been playing around with temporary tables ... everything works fine,
but I've read a few articles saying how they can affect overall performance,
and I'm wondering if there may be a faster alternative.
Basically, the user fires off a query at the DB. With the results, I need to
delete some rows which are retrieved as well (they cannot be filtered out as
part of the SELECT statement because of the effects of LEFT JOINs) as well as amend
certain data values prior to returning the data to the user. I thought
temporary tables would be the way, so I could just
1) Create Temp Table
2) Populate with INSERT
3) DELETE rows fulfilling certain criteria
4) UPDATE certain data values
5) Return contents of Temporary Table to user as ADO dataset.
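For what it's worth, the five steps above might look roughly like this in T-SQL. This is only a sketch: the table and column names (#Results, Orders, Customers, Amount, and the WHERE criteria) are all invented for illustration, since the real schema isn't posted.

```sql
-- 1) Create the temp table (names are hypothetical)
CREATE TABLE #Results (
    OrderID  int         NOT NULL,
    CustName varchar(50) NULL,
    Amount   money       NULL
);

-- 2) Populate it with the original SELECT, LEFT JOIN included
INSERT INTO #Results (OrderID, CustName, Amount)
SELECT o.OrderID, c.CustName, o.Amount
FROM Orders o
LEFT JOIN Customers c ON c.CustID = o.CustID;

-- 3) Delete the rows that couldn't be filtered out in the SELECT
--    (stand-in criteria)
DELETE FROM #Results
WHERE CustName IS NULL;

-- 4) Amend certain data values (stand-in amendment)
UPDATE #Results
SET Amount = Amount * 1.1
WHERE Amount IS NOT NULL;

-- 5) Return the contents to the client as the ADO recordset
SELECT OrderID, CustName, Amount FROM #Results;

DROP TABLE #Results;
```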
I'm wondering now if I could actually get all the Data back from the results
of the initial SELECT statement to the local machine fast, and then run the
DELETE/UPDATES locally so as to remove load from the central server. I've
done timings already comparing Temporary Tables to cycling through the raw
dataset locally and deleting rows and/or updating Data Values. Using
temporary tables is much faster.
So is there a way of getting all the data back as fast as possible from the
central server and then speedily deleting/updating rows of the local dataset,
other than the way I've been testing? I'd wondered about creating some kind
of local table (using DAO, for example) and then firing SQL at that ... but
I'm just not sure if that would be any quicker.
So in summary:
1) What's the quickest way of getting ALL the data from the server to the
local client?
2) What should I use on the local client?
Any thoughts would be greatly appreciated.
Thanks, again
Simon|||Can you post some more details, including DDL and some sample
data. There's a good chance this can all be achieved in a single
SELECT.|||<markc600@.hotmail.com> wrote in message
news:1137676726.723163.59050@.g47g2000cwa.googlegroups.com...
> Can you post some more details, including DDL and some sample
> data. There's a good chance this can all be achieved in a single
> SELECT.
Thanks for replying.
It will take me a while to get this together, I'm afraid.|||On Thu, 19 Jan 2006 12:40:29 -0000, Simon Woods wrote:

>Hi
>I've been playing around with temporary tables ... everything works fine,
>but I've read a few articles saying how they can affect overall performance
>and I'm wondering if there may be a faster alternative performance.
>Basically, the user fires off a query at the DB. With the results, I need to
>delete some rows which are retrieved as well (they cannot be filtered out as
>part of the SELECT statement because of the effects of LEFT JOINs) as well as amend
>certain data values prior to returning the data to the user. I thought
>temporary tables would be the way, so I could just
>1) Create Temp Table
>2) Populate with INSERT
>3) DELETE rows fulfilling certain criteria
>4) UPDATE certain data values
>5) Return contents of Temporary Table to user as ADO dataset.
>I'm wondering now if I could actually get all the Data back from the results
>of the initial SELECT statement to the local machine fast, and then run the
>DELETE/UPDATES locally so as to remove load from the central server.
Hi Simon.
Either I am completely misunderstanding what you're writing, or you are
completely misunderstanding how SQL Server works.
From this message, it appears as if you want to get all rows to the
client, then iterate over the results and send update or delete
statements for each row "to remove load from the server".
However, since the data is stored on the server, you'll have to do the
deletes and updates on the server as well. This means that in your
version, you'll have lots of network traffic and you force SQL Server to
do individual updates and deletes in the order that your application
processes them.
It's much better to keep all the logic in a single stored procedure on
the server. Not only will this save you lots of network traffic, it will
also enable the query optimizer to pick the best order of execution for
the updates and deletes (provided you write them as set-based UPDATE and
DELETE statements, of course).
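To illustrate the set-based approach Hugo describes, the whole job could sit in a single stored procedure on the server. Again a hedged sketch only: the procedure name, tables, and criteria below are invented stand-ins for whatever the real schema requires.

```sql
-- Hypothetical procedure keeping all the logic server-side:
-- one round trip, set-based DELETE/UPDATE, one result set back.
CREATE PROCEDURE usp_GetAmendedResults
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the working set once
    SELECT o.OrderID, c.CustName, o.Amount
    INTO #Results
    FROM Orders o
    LEFT JOIN Customers c ON c.CustID = o.CustID;

    -- One set-based DELETE instead of row-by-row deletes from the client
    DELETE FROM #Results WHERE CustName IS NULL;

    -- One set-based UPDATE instead of row-by-row updates
    UPDATE #Results SET Amount = Amount * 1.1;

    -- Single result set returned to the caller
    SELECT OrderID, CustName, Amount FROM #Results;
END
```

The client then just calls the procedure and reads the recordset, rather than issuing per-row statements over the network.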
Hugo Kornelis, SQL Server MVP|||Hugo ... thanks for replying ... see below
"Hugo Kornelis" <hugo@.perFact.REMOVETHIS.info> wrote in message
news:qp50t15rumqtlr0din5fdbqc4tlh1g01kf@.4ax.com...
> On Thu, 19 Jan 2006 12:40:29 -0000, Simon Woods wrote:
>
> Hi Simon.
> Either I am completely misunderstanding what you're writing, or you are
> completely misunderstanding how SQL Server works.
... the latter is most likely ...

> From this message, it appears as if you want to get all rows to the
> client, then iterate over the results and send update or delete
> statements for each row "to remove load from the server".
> However, since the data is stored on the server, you'll have to do the
> deletes and updates on the server as well. This means that in your
> version, you'll have lots of network traffic and you force SQL Server to
> do individual updates and deletes in the order that your application
> processes them.
On the client I actually want a read-only set of data, so I don't want to do
any updating of the source data; I'm only issuing SQL to update and delete
the temporary table.
Having chatted to some of my colleagues, I've been pointed to ADO
disconnected datasets to try and move load away from temporary tables on the
server to recordset processing on the client. However, at the moment my
preference is to go with the temporary table solution - but I'll have to do
a bit more digging.
Thanks again|||On Fri, 20 Jan 2006 13:06:32 -0000, Simon Woods wrote:
(snip)
>Having chatted to some of my colleagues, I've been pointed to ADO
>disconnected datasets to try and move load away from temporary tables on the
>server to recordset processing on the client. However, at the moment my
>preference is to go with the temporary table solution - but I'll have to do
>a bit more digging.
Hi Simon,
I'm not familiar with ADO, so I can't comment on the disconnected
datasets. But considering that SQL Server is designed to do quick and
efficient manipulations on large amounts of data, I'm very much inclined
to agree with your preference of doing it on the server.
Hugo Kornelis, SQL Server MVP
