Sunday, February 12, 2012

Analysis Services Data Access Auditing with msmdqlog.mdb

I’m starting to look at using a data warehouse and would like to track the
queries run against it. I’ve gone through several articles about the MS
Access database msmdqlog.mdb, and I’m looking for ways to implement the
auditing. I have a few questions.
I guess the easy thing is to generate reports from the Access database using
Access itself, to figure out who is looking at the data and how long the
queries take to run. But my real question is: are there better things to do
with it?
I read that it might be best to migrate the logging away from Access and
into SQL Server directly, but following that thought led me to some horror
stories about the many registry entries that need to be modified and the
pains encountered when applying service packs and the like. It seemed that
Analysis Services was tied in many ways to msmdqlog.mdb for this type of
auditing.
I was also considering the possibility of creating a cube in the data
warehouse based on the QueryLog table from msmdqlog.mdb.
What are some of the prevailing ways to report on usage statistics from
Analysis Services?
Thanks for any help.
Larry|||What we do is import the logs from the Access database into SQL Server
on a regular basis with a DTS package, then report off of the SQL Server copy.
Paul|||Larry,
If you have the space I would move the log; it will speed things up a little
and give you better access to the data you want. I've never heard of any
horror stories, and I sure didn't have any problems.
Here are 2 links to help you get started -
http://www.sqlservercentral.com/col...ces.asp
http://www.sqlservercentral.com/columnists/rbalukonis/analysisaboutdimensions.asp
The first is a basic intro to building a basic cube. If you want, you could
always adapt it to suit the things you want to analyse (e.g. usage by cube, by
user, by hour, etc.)
I have a cube with Users, Cubes, and Time (month, day, hour) dimensions; my
measures are usage (the number of times a user appears in the log) and query
duration.
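A usage-by-user-by-cube summary like the measures above boils down to a plain SQL aggregate over the imported log. Here is a minimal sketch using an in-memory SQLite table as a stand-in for the QueryLog table; the column names (MSOLAP_User, MSOLAP_Cube, Duration) are assumptions, so check them against your actual log schema:

```python
import sqlite3

# In-memory stand-in for the QueryLog table imported into SQL Server.
# Column names here are assumptions; inspect your actual log schema.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE QueryLog (
        MSOLAP_User TEXT,
        MSOLAP_Cube TEXT,
        Duration    INTEGER  -- query duration in milliseconds
    )
""")
con.executemany(
    "INSERT INTO QueryLog VALUES (?, ?, ?)",
    [
        ("DOMAIN\\alice", "Sales",   120),
        ("DOMAIN\\alice", "Sales",    80),
        ("DOMAIN\\bob",   "Sales",   300),
        ("DOMAIN\\bob",   "Finance",  50),
    ],
)

# Usage = number of times a user appears in the log for a cube;
# also report the average query duration per user/cube.
rows = con.execute("""
    SELECT MSOLAP_User, MSOLAP_Cube,
           COUNT(*)      AS usage,
           AVG(Duration) AS avg_duration
    FROM QueryLog
    GROUP BY MSOLAP_User, MSOLAP_Cube
    ORDER BY usage DESC
""").fetchall()

for user, cube, usage, avg_dur in rows:
    print(f"{user:15s} {cube:8s} usage={usage} avg={avg_dur:.0f} ms")
```

The same GROUP BY, run in T-SQL against the table you imported, would feed the usage and duration measures of a reporting cube directly.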
Realistically you can only get a feel-good factor with the data. Why?
Because usually you only log every 10th query, so if user A makes 9 queries
and user B makes the 10th, only user B shows up in the log. Also, depending
on the front end, each time a user runs a new query the client cache may
answer it, and therefore it won't show in the log. There are quite a few
things that mean you don't get an accurate picture of who's doing what, but
the log is a fair yardstick.
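The sampling bias described above is easy to see with a toy simulation: under 1-in-10 logging, a user who issues 9 queries can vanish from the log entirely while the user who happens to issue the 10th gets all the credit. A minimal sketch (assuming, per the description, a single server-wide counter):

```python
from collections import Counter

def sample_log(queries, every_n=10):
    """Keep only every Nth query, mimicking 'log once per N queries'."""
    logged = []
    for i, user in enumerate(queries, start=1):
        if i % every_n == 0:
            logged.append(user)
    return logged

# User A issues 9 queries, then user B issues the 10th.
stream = ["A"] * 9 + ["B"]
true_counts = Counter(stream)                # {'A': 9, 'B': 1}
logged_counts = Counter(sample_log(stream))  # only B appears

print("actual:", dict(true_counts))
print("logged:", dict(logged_counts))
```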
I use the log as a gauge of cube utilisation/importance and user usage, for
preheating the cache at peak periods, and for reporting to managers.
Personally, query duration is not that important to me, even though I know it
should be.
Data is also available to analyse by dimension; it is fairly hard to
translate and prone to problems when dimensions are radically changed or
added. If it were reliable it would prove very beneficial. Have a look, you
may find it of great benefit.
Hope this helps you start out.|||Thanks for the help, but I ran into a problem.
I read and followed the article “Analysis About Analysis Services”, but I’m
having problems with authentication. Is there any type of authentication tied
to Analysis Services and the Access database?
I imported msmdqlog.mdb into a new SQL Server 2000 database (there were
about 30 records).
I changed the registry entries QueryLogConnectionString and
RemoteQueryLogConnectionString to use the SQLOLEDB data provider, initial
catalog, and server.
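For reference, the value those registry entries hold is an OLE DB connection string, something along these lines (the server name is a placeholder; the database name is taken from the error later in this thread, and the exact registry key location varies by installation):

```
Provider=SQLOLEDB.1;Integrated Security=SSPI;Initial Catalog=OLAP_Audit_Tracker;Data Source=MyServer
```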
I created the history table and then decided to test it out by querying the
FoodMart cube with Excel and the MDX Sample Application. No results were
logged. I returned to the msmdqlog.mdb file and nothing was added there,
which I was glad to see. But I can’t get new entries added to the QueryLog
table in SQL. I figured something is wrong with the connection string.
I went to the Analysis Services server properties and clicked Clear Log, then
checked the QueryLog table; the records I had brought over from Access had
been cleared from the SQL table.
I went to the Event Viewer and found a message that read:
Event Type: Error
Event Source: MSSQLServerOLAPService
Event Category: Server
Event ID: 124
Date: 11/30/2004
Time: 12:03:32 PM
User: N/A
Computer: ComputerName
Description:
Relational data provider reported error: [Cannot open database requested in login 'OLAP_Audit_Tracker'. Login fails.;42000].
For more information, see Help and Support Center at
http://go.microsoft.com/fwlink/events.asp.
I’ve changed SQL Security and made all the appropriate changes but still get
stopped at the authentication issue.
Any thoughts?
Thanks
Larry
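As an aside on the error above: “Cannot open database requested in login … Login fails.” usually means the connecting account can log in to SQL Server but has no access to that particular database. On SQL Server 2000 the grant might look like this; the service account name is an assumption, so substitute whatever account MSSQLServerOLAPService actually runs under:

```sql
-- Hypothetical service account; substitute the account that the
-- MSSQLServerOLAPService service actually runs under.
EXEC sp_grantlogin N'DOMAIN\OLAPService'
GO
USE OLAP_Audit_Tracker
GO
EXEC sp_grantdbaccess N'DOMAIN\OLAPService'
GO
-- The logger needs to read, write, and (for Clear Log) delete log rows.
GRANT SELECT, INSERT, DELETE ON dbo.QueryLog TO [DOMAIN\OLAPService]
GO
```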
"HWUK" wrote:
> [earlier reply quoted in full; clipped]
|||First place to look is the Logging tab in the properties of your Analysis
Services server. Where it says "write to log once per ? queries" is how many
queries have to be sent to the server before a log entry is written. If the
value is high, your tests may not have made enough queries; the default
should be 10. Also remember that PivotTable Service uses intelligent caching,
so doing ten queries in a pivot table may not generate ten requests to the
server; the client computer may be able to answer a query from its cache. To
do a test you could reduce the setting to 1 and return it to a more suitable
number later.
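The effect of that setting on a small test run is easy to quantify: with the default of 10, a handful of test queries can easily produce zero log rows, which looks exactly like broken logging. A minimal sketch:

```python
def rows_logged(num_queries, every_n):
    """How many log rows a run of queries produces under 1-in-N logging."""
    return num_queries // every_n

# A short test session of 7 queries:
print(rows_logged(7, every_n=10))  # 0 -> looks as if logging is broken
print(rows_logged(7, every_n=1))   # 7 -> every query is logged
```

This is before accounting for client-side caching, which can shrink the number of queries that ever reach the server in the first place.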
If you feel everything is OK with the logging, then look at the
authentication. Does the user have adequate permissions? What is the user's
default database? Is the user a member of the Analysis Services
administrators? Is the Analysis Server service running under a domain
account? And so on.
Check out http://webservertalk.com/t943017.html and see if this discussion is
similar to your case. Also see http://support.microsoft.com/?id=224973
Have a look at those to begin with; if you still have problems, well, I may
need more information.
