Saturday, April 2, 2022

Setup is Split Across Multiple CDs

Lately I've seen a bunch of people hitting installation errors that have to do with the fact that SQL Server is now split across multiple CDs. During a SQL installation you may get an error (it shows up in SQLSetup####_*_Core(Local).log file) that looks like this:

Error: Action "InstallToolsAction.11" threw an exception during execution.  Error information reported during run:
Target collection includes the local machine.
Fatal Exception caught while installing package: "11"
        Error Code: 0x80070002 (2)
Windows Error Text: The system cannot find the file specified.

  Source File Name: sqlchaining\sqlprereqpackagemutator.cpp
Compiler Timestamp: Tue Aug 30 21:43:50 2005
     Function Name: sqls::SqlPreReqPackageMutator::modifyRequest
Source Line Number: 196


This is a somewhat unfriendly way of saying it cannot find CD 2 (the Tools CD), or that it cannot find the .\Tools directory. The SQL Server installation files are split between two folders: "Servers" (CD1) and "Tools" (CD2). You normally run setup from .\Servers\setup.exe (or CD1 .\setup.exe). If the installation files are copied locally or to a file share, for this to work properly you need to make sure that 1) the Tools directory exists and 2) you have access to it. If you've dumped the install files to a network share and mapped a drive to the share, be sure to map to the parent folder of Servers and not directly to Servers. For example: \\some_machine\some_share\Apps\Microsoft\SQL2K5\SeptCTP\. Let's say the .\SeptCTP\Servers and .\SeptCTP\Tools folders live under here. Your network share needs to point to SeptCTP, not Servers. If you mapped the G: drive, you would then run setup as G:\Servers\Setup.exe.

If you burn the two images to two CDs and you run a GUI install, you will be prompted to insert CD2. However, if you do a silent install from the CD, it will crash if you ask to install tools - in a silent install we have no way of prompting the user for CD2. To run silent installs, either burn a DVD or copy the files locally (or to a network share). If you grab the DVD image and burn everything to a single DVD you won't be prompted for CD2 - setup will automatically find the Tools folder on the media.
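For local or file-share installs, a quick pre-flight check that Servers and Tools are siblings can save a failed silent install. A minimal sketch (the paths here are hypothetical; point root at your own copy of the media):

```python
from pathlib import Path

# Hypothetical local copy of the install media; adjust to your share or drive.
root = Path("/tmp/SQL2K5/SeptCTP")
(root / "Servers").mkdir(parents=True, exist_ok=True)  # simulate CD1's folder
(root / "Tools").mkdir(exist_ok=True)                  # simulate CD2's folder

def layout_ok(root):
    # setup.exe is launched from Servers, and Tools must be its sibling.
    return (root / "Servers").is_dir() and (root / "Tools").is_dir()

print("layout OK" if layout_ok(root) else "Tools missing -- silent install will fail")
```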

I hope this helps!

Dan

pivot table % and $ Formatting ?

Hi Guys

Just wondering: after I've formatted certain facts (e.g. Format String: Percentage and Format String: Currency), all values are formatted correctly in AS; however, when I load the data into an Excel pivot table, all formatting is lost.

Is there any way to keep the formatting in the pivot table?

Thanks
Tom

Reply:
Same problem here, and same question.

Antoine.

------------------------------------

question about Dimensions...

I'm reading Microsoft 'SQL Server 2000 Analysis Services' by Reed Jacobson and have come across a passage which I don't understand. Being fairly new to this I was hoping someone could help:

"Consider this extreme example: with 128 members in a single dimension, a report has 128 possible values, but with those same 128 total members split up into 64 dimensions - with two members in each dimension - a report has 18,446,744,073,709,551,616 possible values."

I've got no idea where he gets those figures from. Can you help?
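For what it's worth, the numbers follow from the cross product of the dimensions: a report over one 128-member dimension has 128 possible cells, but 64 independent dimensions with 2 members each give 2^64 combinations. A quick check:

```python
# One dimension with 128 members: 128 possible report values.
single = 128

# 64 dimensions x 2 members each: every report cell is one choice per
# dimension, so the cross product is 2**64 combinations.
split = 2 ** 64

print(split)  # 18446744073709551616 -- the figure quoted in the book
```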

cheers

j.

Aggregating time stamp data for a time series plot

I have an application that records data points using timestamps. 
For example:

START        FINISH        VALUE
01/01/2005  02/25/2005  5
02/25/2005  04/16/2005  3
04/16/2005  06/03/2005  7

I need to be able to plot the data as a time series prorated across the time "buckets" (think of a stock ticker).  I have done it successfully in T-SQL but was looking for a faster way.  Is this something Analysis Services could do without having to write a complex procedure to "bucketize" the data?  I am new to Analysis Services so am not quite sure of its capabilities.
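Whichever tool ends up doing it, the proration itself is just overlap-weighted allocation: each [start, finish) interval's value is split across the buckets it overlaps, in proportion to days of overlap. A rough sketch of that logic (monthly buckets and day-count weighting are assumptions about what "prorated" means here):

```python
from datetime import date

def prorate(start, finish, value, buckets):
    """Split value across half-open [start, finish) proportionally to days of overlap."""
    total_days = (finish - start).days
    shares = []
    for b_start, b_end in buckets:
        overlap = (min(finish, b_end) - max(start, b_start)).days
        shares.append(value * max(overlap, 0) / total_days)
    return shares

# Monthly buckets for Jan and Feb 2005:
buckets = [(date(2005, 1, 1), date(2005, 2, 1)),
           (date(2005, 2, 1), date(2005, 3, 1))]

# First row from the example: 01/01/2005 - 02/25/2005, value 5.
shares = prorate(date(2005, 1, 1), date(2005, 2, 25), 5, buckets)
# 31 of the 55 days fall in January, 24 in February; the shares sum back to 5.
```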

Sept CTP - Writeback - Out of memory error

I have been attempting to do a simple writeback example for a client.

I started using the Excel Add-In and found a whole host of crazy issues, with the finale being that when you Commit the What-If session, nothing happens. I mean nothing. I turned on the profiler, and found that not a single update command gets sent.

So I went the "old-school" way and wrote the UPDATE CUBE ... MDX, a very simple statement that updated a leaf value in the AW Direct Sales cube. After a few minutes of 100% CPU I get an "out of memory" error.

I made sure that writeback was enabled, created the writeback partition, processed the partition from SSMS to make sure it was working OK, created the writeback table, marked all the dimensions writeback, added the additional role as administrator and added myself (most of this seems excessive, but it was 2 AM and I have a demo coming).

Any help? Does the Sept CTP even have writeback working? Can someone send me the steps to reproduce on AW? I am trying to sell the client on SQL 2005, but if I can't get this to work I am dead in the water.

Steve

problems with the excel-addin

Hello,

I tested the excel-plugin for analysis services and had some odd problems.
http://www.microsoft.com/downloads/details.aspx?FamilyID=dae82128-9f21-475d-88a4-4b6e6c069ff0&displaylang=en

When I build freeform reports, they work fine at first, until I get only "#Value". If I refresh the sheet, I get the message "Cannot verify connection" even though the connection is definitely correct.

Have others also had problems with this Add-in?

Tobias

MDX problem with AS2005 CTP Sep 2005

I migrated the Foodmart 2000 sample database to AS2005. Executing the following MDX query works fine:

select {[Measures].[Store Sales]} on 0, Descendants([Store].[All Stores], [Store].Levels(2), LEAVES) on 1 from sales

10 "store states" are returned as expected.

If I add NON EMPTY like in the following query

select {[Measures].[Store Sales]} on 0, NON EMPTY Descendants([Store].[All Stores], [Store].Levels(2), LEAVES) on 1 from sales

the store states without "Store Sales" values are discarded, but now the result additionally contains the "Store Country" member "USA". Why?
"USA" is neither a member of the "Store State" level nor a leaf member.

Can anybody explain this?

Thanks

Guido

'provider not found or it may not be installed properly error' in analysis services

Hi,
I am new to OLAP services. I am getting the error 'provider not found or it may not be installed properly' when using the Browse Data option on cubes. Can anybody tell me what exactly this error means?
Is it that Analysis Services is not installed properly?


thanx in advance...

Error while generating CUBE "(-2147217900) Invalid Column name 'ASSN_UID'"

Hi,

While trying to generate OLAP CUBE with "Project Web Access"(Microsoft Project Package 2003) I got the following error.

"Current Cube Status

   The Cube scheduled to be built on 5/24/2005 at 2:42 P.M Failed.
   (-2147217900) Invalid Column name 'ASSN_UID'.
"

Please advise me on how I can resolve this issue.


Regards,
Sudhakar

Reply:
You may want to post this question on the SQL Server Analysis Services forum at http://forums.microsoft.com/msdn/ShowForum.aspx?ForumID=83.

------------------------------------
Reply:
Hi,
we have the same problem. If you have an answer, please tell us.

Regards

MT


------------------------------------
Reply:
Hi,

I have this problem just after installing Project Server 2003 SP2...
Any ideas to fix the problem without reinstalling Project?

Thx

Az.

------------------------------------

Partition data file disappeared!?!

We are running AS 2000 (latest and greatest) on an 8 processor, 16 GB RAM, 64 Bit Itanium server.  We were processing some partitions in parallel when we received the following error while processing (using the DSO library):

File error [during the Open operation] 'Projection_3fact.data The system cannot find the file specified. '

It seems the .data file for one of our partitions just disappeared.  Any ideas how this could happen?

Here is the relevant section of the processing log...
******************************************************
10/7/2005 1:11:06 AM 19647         Writing aggregations and indexes of Partition 'Projection_3' (segment  38)
10/7/2005 1:11:06 AM 19647         Writing aggregations and indexes of Partition 'Projection_3' (segment  39)
10/7/2005 1:11:06 AM 19647         Writing aggregations and indexes of Partition 'Projection_3' (segment  40)
10/7/2005 1:11:06 AM 19647     Completed Processing Partition 'Projection_3'. End time: 1:11:06 AM  Duration: 0:01:51  Rows processed: 2560734
10/7/2005 1:11:06 AM 19647     Merging partitions of Partition 'Projection_3' Start time: 1:11:06 AM
10/7/2005 1:11:06 AM 19647         Initializing Partition 'Projection_3'
10/7/2005 1:11:07 AM 19647         Merging partitions of Cube 'Projection' Start time: 1:11:07 AM
10/7/2005 1:11:07 AM 19647             Merge in progress: Merging partition '~Projection_3' into 'Projection_3'
10/7/2005 1:11:07 AM 19647             Analysis Server Error: File error [during the Open operation] 'Projection_3fact.data The system cannot find the file specified. ' ; Time:10/7/2005 1:11:07 AM
10/7/2005 1:11:07 AM 19647             Analysis Server Error: File error [during the Open operation] 'Projection_3fact.data The system cannot find the file specified. ' ; Time:10/7/2005 1:11:07 AM
10/7/2005 1:11:07 AM 19647         File error [during the Open operation] 'Projection_3fact.data The system cannot find the file specified. ' ; Time:10/7/2005 1:11:07 AM
10/7/2005 1:11:07 AM 19647     Processing Cube 'Projection' failed. No changes have been made to the database.

Problems with Service Pack 4 for Analysis Services

Hi All,

I just installed Analysis Services SP 4 on our database server.

We are having major problems with our DTS packages and our OLAP cube processing.

Anybody got any ideas?

I did have to apply the "msmdsrv.exe" fix before installing Analysis Services.  This is a documented fix by which the msmdsrv.exe file is renamed prior to the installation of AS SP4.

Symptoms:
  Cubes not completing processing or taking much longer than before.

Some troubles with building PivotTables in C# and Rendering XMLA

I'm writing a C# class that builds a very basic Pivot Table object and takes the xml data from that object and populates an html object param.  I'm using the latest OWC 11.0 v. 11.0.0.6555 and OLAP 9 from the Sept. release.  

One thing that I see when stripping everything down to the XMLA is that the XML output has some closing tag violations.  Both "QuerySource" and "PTSource" are not closed out.  Does anyone know about this?   

Second, the XMLA from a spreadsheet (once exported to HTML) is very different from the XMLA that the C# object is generating. First, the default MSOLAP version is 2. For the short run, I'm hardcoding this attribute (see the C# code below). Second, the nodes do not match up; they look very much like the nodes of OLAP 8. Do I need to re-register OLAP 9 because OLAP 8 might still be the default?

Any guidance on this will be greatly appreciated and shared with others. 
Thanks
Tristian

Here's some source code.

XMLA Bug:
<XML xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:x="urn:schemas-microsoft-com:office:excel" xmlns:html="http://www.w3.org/TR/REC-html40">
  <WORKSHEETOPTIONS xmlns="urn:schemas-microsoft-com:office:excel">
   <ZOOM>
    0
   </ZOOM>
   <SELECTED/>
    <TOPROWVISIBLE>
     2
    </TOPROWVISIBLE>
    <PANES>
     <PANE>
      <NUMBER>
       3
      </NUMBER>
      <ACTIVEROW>
       2
      </ACTIVEROW>
      <RANGESELECTION>
       $A$3:$D$1856
      </RANGESELECTION>
     </PANE>
    </PANES>
    <PROTECTCONTENTS>
     False
    </PROTECTCONTENTS>
    <PROTECTOBJECTS>
     False
    </PROTECTOBJECTS>
    <PROTECTSCENARIOS>
     False
    </PROTECTSCENARIOS>
  </WORKSHEETOPTIONS>
  <PIVOTTABLE xmlns="urn:schemas-microsoft-com:office:excel">
   <PTSOURCE>
    <DATAMEMBER>
     Business Unit- Position
    </DATAMEMBER>
    <CACHEINDEX>
     1
    </CACHEINDEX>
    <VERSIONLASTREFRESH>
     2
    </VERSIONLASTREFRESH>
    <REFRESHNAME>
     John Bergere
    </REFRESHNAME>
    <REFRESHDATE>
     2005-10-05T11:14:40
    </REFRESHDATE>
    <REFRESHDATECOPY>
     2005-10-05T11:14:40
    </REFRESHDATECOPY>
    <NOSAVEDATA/>
     <CUBESOURCE/>
      <QUERYSOURCE>
       <QUERYTYPE>
        OLEDB
       </QUERYTYPE>
       <COMMANDTEXT>
        Business Unit- Position
       </COMMANDTEXT>
       <COMMANDTYPE>
        Cube
       </COMMANDTYPE>
       <MAINTAIN/>
        <CONNECTION>
         Provider=MSOLAP.3;Cache Authentication=False;Persist Security Info=True;User ID=&amp;quot;&amp;quot;;Initial Catalog=BusinessUnit;Data Source=DALDEV4;Impersonation Level=Impersonate;Location=DALDEV4;Mode=ReadWrite;Protection Level=Pkt Privac;Auto Synch Period=20000;Default Isolation Mode=0;Default MDX Visual Mode=0;MDX Compatibility=0;MDX Unique Name Style=0;Non Empty Threshold=0;SQLQueryMode=Calculated;Safety Options=2;Secured Cell Value=0;SOURCE_DSN_SUFFIX=&amp;quot;Prompt=CompleteRequired;Window Handle=0x40126;&amp;quot;;SQL Compatibility=0;Compression Level=0;Real Time Olap=False;Packet Size=4096
        </CONNECTION><!--bug that this didn't close-->
      </QUERYSOURCE><!--bug that this didn't close-->
     </PTSOURCE>
  </PIVOTTABLE>
 </XML>


C# Class

using System;
using System.Data;
using System.Configuration;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
using System.ComponentModel;
using Microsoft.Office.Interop.Owc11;

/// <summary>
/// Summary description for SamplePivotTable
/// </summary>
namespace HBK.Web.Control
{
    public class PivotTable : WebPart //, IWebPart, IWebActionable
    {
        private PivotTableClass oPTClass;
        private PivotView oPivotView;
        Button _btnSaveSettings = new Button();
        private HiddenField _hiddenSettings = new HiddenField();
        HtmlGenericControl _ActiveXControl = new HtmlGenericControl();
        HtmlGenericControl _ActiveXParamControl = new HtmlGenericControl();
        Table _tbMain = new Table();
        TableRow _trMain = new TableRow();
        TableCell _tcMain = new TableCell();
        private string _strXML;

        private OlapConnEnum _olapconnectionstring;

        [WebBrowsable]
        [Personalizable]
        [Bindable(true), DefaultValue(null)]
        public OlapConnEnum OLAPConnectionString
        {
            get { return _olapconnectionstring; }
            set { _olapconnectionstring = value; }
        }

        private string _datamember;

        [WebBrowsable]
        [Personalizable]
        [Bindable(true), DefaultValue(null)]
        public string DataMember
        {
            get { return _datamember; }
            set { _datamember = value; }
        }

        private string _personalsettings;

        [Personalizable]
        public string PersonalSettings
        {
            get { return _personalsettings; }
            set { _personalsettings = value; }
        }

        // Property of IWebPartAction
        public override WebPartVerbCollection Verbs // Add "Save Changes" verb to the web part
        {
            get
            {
                // Create custom verb
                WebPartVerb saveStateVerb = new WebPartVerb(
                    "savepivottable",
                    new WebPartEventHandler(SavePivotTableState),
                    "document.getElementById('" + _hiddenSettings.ClientID + "').value = " +
                    "document.getElementById('" + _ActiveXControl.ClientID + "').XMLData;");
                saveStateVerb.Text = "Save Changes";
                saveStateVerb.Description = "Save changes to the table.";

                // Create collection of verbs
                WebPartVerb[] wpverbs = new WebPartVerb[] { saveStateVerb };

                // Return WebPartVerbCollection including base verbs
                return new WebPartVerbCollection(wpverbs);
            }
        }

        public PivotTable()
        {
            //
            // TODO: Add constructor logic here
            //
        }

        protected override void CreateChildControls()
        {
            base.CreateChildControls();

            // Pivot table
            oPTClass = new PivotTableClass();
            oPTClass.ConnectionString =
                ConfigurationManager.ConnectionStrings[_olapconnectionstring.ToString()].ToString();
            oPTClass.CubeProvider = "MSOLAP.3";
            //oPTClass.DataMember = _datamember;
            //oPTClass.DisplayFieldList = true;

            // Pivot view
            oPivotView = oPTClass.ActiveView;
            //oPivotView.FieldLabelBackColor = "#E8E8D0";
            //oPivotView.FieldLabelForeColor = "#334D66";
            //oPivotView.FieldLabelFont.Size = 10;
            //oPivotView.Label.Visible = false;
            //oPivotView.FieldLabelFont.Name = "Arial, Helvetica, sans-serif";
            //oPivotView.HeaderHAlignment = PivotHAlignmentEnum.plHAlignCenter;
            //oPivotView.HeaderForeColor = "#334D66";
            //oPivotView.HeaderBackColor = "#E8E8D0";
            //oPivotView.HeaderFont.Name = "Arial, Helvetica, sans-serif";
            //oPivotView.PropertyValueFont.Color = "Red";
            //oPivotView.TotalFont.Size = 10;
            //oPivotView.TotalFont.Name = "Arial, Helvetica, sans-serif";
            //oPivotView.TitleBar.ForeColor = "#E8E8D0";
            //oPivotView.TitleBar.Font.Color = "#334D66";

            // Hidden settings
            _hiddenSettings.ID = this.ID + "_hiddensettings";
            _hiddenSettings.EnableViewState = true;
            _tcMain.Controls.Add(_hiddenSettings);

            // ActiveX param control
            _ActiveXParamControl.TagName = "param";
            _ActiveXParamControl.ID = this.ID + "_xmldata";
            _ActiveXParamControl.EnableViewState = true;
            _ActiveXParamControl.Attributes["name"] = "XMLData";

            // ActiveX control
            _ActiveXControl.TagName = "object";
            _ActiveXControl.Attributes["classid"] = "CLSID:0002E55A-0000-0000-C000-000000000046";
            _ActiveXControl.ID = this.ID + "_pivottable";
            _ActiveXControl.Attributes["width"] = "100%";
            _ActiveXControl.Controls.Add(_ActiveXParamControl);
            _tcMain.Controls.Add(_ActiveXControl);

            _trMain.Cells.Add(_tcMain);
            _tbMain.Rows.Add(_trMain);
            Controls.Add(_tbMain);
        }

        protected override void OnPreRender(EventArgs e)
        {
            if (_personalsettings != null && _personalsettings.Length > 0)
            {
                _strXML = _personalsettings;
            }
            else
            {
                _strXML = oPTClass.XMLData;
            }
            _ActiveXParamControl.Attributes["value"] = _strXML;
            oPTClass = null;
        }

        protected override void Render(HtmlTextWriter sHTML)
        {
            base.RenderBeginTag(sHTML);
            _tbMain.RenderControl(sHTML);
            base.RenderEndTag(sHTML);
        }

        // Called when the "Save Changes" verb is clicked
        private void SavePivotTableState(object sender, EventArgs e)
        {
            EnsureChildControls();
            _personalsettings = _hiddenSettings.Value;
        }
    }
}


One or more errors occurred during processing the command

Hi Friends,

I'm getting an error 'Unable to open cellset. One or more errors occurred during processing the command', when I try to run the following query in MDX sample application.

with member [Measures].Test as
'IIF (IsError([Product].[All Products].[Drink].[Alcoholic Beverages]),
1000,
2000
)
'
SELECT
order({[Store].[All Stores].[Canada].[BC].Children},([Time].[1998].[Q4]),DESC) on rows,
{[Time].[1998].[Q4]} on columns
from
Sales
where
[Measures].Test


If I remove the IsError function from the definition of the measure [Measures].Test, the query runs successfully without errors. Likewise, if I use the IsError function to validate measures (for example, IsError([Measures].[Profit])), the error does not occur. So, I guess the problem occurs only when the IsError function is used to validate dimensions with multiple hierarchies or levels? Can this query be run in any other way to get the same result? Please get back with your views on this.

Thanks,
Niyas.

Analysis services stops

Hi guys,

I'm using sql 2005(june ctp) and installed patches.
When I try to browse my processed cube in either Management Studio or development studio, Analysis Services stops.

Below is the error message from the event log.
The connection either timed out or was lost.
Unable to read data from the transport connection: An existing connection
was forcibly closed by the remote host. (System)

Any ideas? Is there any configuration that I need to do?

Any help will be appreciated.

Thanks,

Reply:
I have the exact same problem when I try to process a large cube. The processing stops with the same error message, and Analysis Services is stopped on the server.

I haven't found any reason or solution.

Thanks,

------------------------------------

Attribute Hierarchy Issue

Folks,

Hoping someone can help me figure out a problem I've run into.

After migrating the Foodmart 2000 database from AS 2000 to AS 2005, I want to make a series of changes to it to demonstrate some of the new features of AS 2005. Thus, I imported the new Foodmart 2000 database into a project in BIDS, then opened the Customer dimension.

In the Customer dimension, all of the attribute hierarchies are turned off. A single hierarchy named Customers exists with Country, State Province, City, and Name as the levels. If you browse the Sales cube, you'll see the hierarchy working fine.

Now, I wanted to enable the Education attribute hierarchy to compare the behaviour of attribute hierarchies to normal hierarchies, given that another dimension named Education Level exists that is based off of the education column in the customer table. So, I set the AttributeHierarchyVisible property for the Education attribute to true, changed nothing else, and redeployed and reprocessed the entire project.

After reprocessing, I opened the Sales cube for browsing within BIDS. I can browse anything in the cube (measures, dimensions, etc.) except for the Education attribute hierarchy. If I try to browse it, I get the following error message displayed in the browser window:

The query could not be processed: Two sets specified in the function have different dimensionality.

Now, if you expand the Education attribute hierarchy in the metadata viewer to the left of the browse window, you'll see that the Education attribute lists [All Customers] as the topmost member, with [Canada], [Mexico], [USA] as children. This is not right as the children of [All Customers] in the attribute hierarchy should be the list of education levels.

Also, I turned on Profiler to capture the MDX being executed when attempting to browse the new Education attribute hierarchy. It attempts to perform the following query:

CREATE SESSION SET [Sales].[{7F001518-E517-403F-A50A-0A2A82677443}Pivot1Axis1Set0] AS
'{ { [All Customers] }, AddCalculatedMembers([Education].[Education].MEMBERS) }'

Thus, it appears to be creating a set to include [All Customers] and the members of the new Education attribute hierarchy. Which makes sense. But executing that query using SSMS results in the same error message as listed above.

So, does anyone know what might be causing this issue? Is there some other setting or some other change that I need to make to the Customers dimension and/or the Education attribute hierarchy to get this to work? Like I said, all I want to do is turn on the Education attribute hierarchy and be able to browse using it as well as the separate Education Level dimension to show how they compare.

Anyone? Thanks...

Dave Fackler

SQL Server 2005 beta 2. Calculated MEMBER error.

If you have configured your database as case sensitive and you add a calculated measure to a cube in BI Development Studio, it fails.
This is because BI Dev Studio generates [MEASURES].[New Measures], and when you try to deploy the cube it raises an exception saying it can't find the [MEASURES] dimension.
Solution: edit the measures by hand and put [Measures] instead (with the script view).
I hope it helps someone, because it drove me crazy.

Assigning Named set to Dimension Member

I currently have a dimension called Period; it is not a time-type dimension.  The structure is [Year 2005 - 2006] - [Semi 1] - [Quarter 1] - [January 2005 - 2006].  I have been struggling for a while now to figure out how to create a new member in Period called CurrentMonth and somehow have it reference the correct Period dimension member.  I have looked at named sets, but do not know how to link up a 'virtual' or dynamic CurrentMonth dimension member of Period to that named set.  Any ideas? 

Will Analysis Services allow loading data from my custom OLE DB data provider

Hello,
I am planning to develop a custom OLE DB provider for my buffered data. I tried loading data into Analysis Services from a CSV file using the "Microsoft Text Provider", but I have found that such data loading is officially not supported in Analysis Services. So I was wondering if Analysis Services will allow me to load data from my custom OLE DB provider.
Any pointers in this regard are welcome.
Thanks in advance,

Message compression

We have run some tests on our application. The average message is about 2.5 MB, and messages are sent once every 30 minutes. This is 3.5 GB per month for one site. We already have three sites that will be sending these messages. This will be a VERY high load on the WAN channel, and will cost us a LOT of money.
Isn't it possible to add some compression? XML should compress very well; this feature would make brokers more useful.

Reply:
Service Broker messages are not required to be XML. They can simply be blobs. So you could perform the compression and decompression fairly easily in your app and send compressed messages through the Broker. That way you control the compression algorithms and not the system.
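To give a feel for the potential savings, here is a minimal app-side sketch using zlib (the payload is made up; a real Service Broker message body would be sent as binary):

```python
import zlib

# A made-up, repetitive XML payload standing in for a 2.5 MB message.
payload = ("<row><id>42</id><value>3.14</value></row>" * 10000).encode("utf-8")

# Sender: compress before SEND.
compressed = zlib.compress(payload, 9)

# Receiver: decompress after RECEIVE, then parse as usual.
assert zlib.decompress(compressed) == payload

print(len(payload), "->", len(compressed), "bytes")
```

Real XML won't be this repetitive, but markup-heavy payloads routinely compress several-fold.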

------------------------------------
Reply:

Some tips for reducing the traffic when using Service Broker:
- use binary messages instead of XML
- compress the message payload in the application, before calling SEND
- use one long lived dialog to send all messages, don't start a new dialog for every message
- few large messages is better than many small messages
- use short names for services, contracts, message types

I'd add that Service Broker is quite unchatty. For instance, there is no handshake for dialogs: the first message SEND on the dialog is the first message sent on the wire.

HTH,
~ Remus


------------------------------------

September CTP available

Hi everyone -

Just wanted to let you know that the September CTP is now available
for download from the Microsoft site.

http://www.microsoft.com/sql/2005/productinfo/ctp.mspx


HTH...

--
Joe Webb
SQL Server MVP


~~~
Get up to speed quickly with SQLNS
http://www.amazon.com/exec/obidos/tg/detail/-/0972688811

I support PASS, the Professional Association for SQL Server.
(www.sqlpass.org)

OLAP Cubes problem

Hi,

I have a fact table where two of its columns refer to the same dimension. I wanted to analyze the movement between these two columns.

Example:

columns:

ID region1 region2
1 tor ny
2 tor tor
3 tor miss
4 miss miss
5 miss miss
6 miss tor
7 ny miss
8 ny ny
9 ny ny

My cube should look like this (counts of region1 by region2):

region1\region2   tor   miss   ny
tor                 1      1    1
miss                1      2    -
ny                  -      1    2

What do I need to do?

Thanks a lot for all your help.


Reply:

Posting this on the Analysis Services forum should be step 1.
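That said, what is being described is a cross-tabulation of the fact table against the same region dimension used twice (one cube dimension per column). The counting logic itself is simple; a plain-Python sketch of the expected result:

```python
from collections import Counter

# The example fact rows as (region1, region2) movements.
rows = [("tor", "ny"), ("tor", "tor"), ("tor", "miss"),
        ("miss", "miss"), ("miss", "miss"), ("miss", "tor"),
        ("ny", "miss"), ("ny", "ny"), ("ny", "ny")]

matrix = Counter(rows)  # count of each (region1, region2) pair

regions = ["tor", "miss", "ny"]
for r1 in regions:
    print(r1, [matrix[(r1, r2)] for r2 in regions])
```

In Analysis Services the equivalent is to join the fact table to the region dimension table twice, giving two cube dimensions (say, "Region From" and "Region To"), and put one on rows and the other on columns.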


------------------------------------

Creating a dimension based on uniqueidentifier

Is it possible to base a dimension on a uniqueidentifier?  My company uses uniqueidentifiers for all primary keys in our tables.  However, I am now trying to create a parent-child dimension in Analysis Manager using one of our tables but I always get the error

Unable to count the members of the 'PK' level.

Unable to open the record set.

Error: The count-unique aggregate operation cannot take a uniqueidentifier data type as an argument.

Any suggestions?

Reply:
Note:  this is in AS 2000. 

Anyone know if this is possible in 2005?

------------------------------------

Connecting to Analysis Services 2000

Hi,

I am facing a problem connecting to AS 2000 (Analysis Services) on a remote machine. I have built a DLL, "connection.dll", on the .NET Framework using the ADOMD.NET libraries. I am exporting some functions in the "connection.dll" library (which is a .NET DLL) to connect to and read data from AS 2000.

Now the problem is that whenever I try to connect to the remote machine using "connection.dll" by calling it through another DLL called "libReader.dll" (which is a regular C++ DLL), it gives the error "database 'Foodmart 2000' does not exist" (Foodmart 2000 is a sample database and gets deployed with AS 2000 by default). However, the connection is successful if we connect to the local AS 2000.

The following scenario is not working:-

 libReader.dll -------------------> (Connect (a,b,c) - Function call) -----------------> connection.dll --------------------------> ADOMD.net ---------------------> AS 2000 (on Remote m/c)

Now, whenever I use "connection.dll" in the Win32 app "Disp.exe" for connecting to remote machines, it does not give any error and is able to connect to the AS 2000 FoodMart 2000 database.

The following scenario is working:-

Win32 Console App --------------------> (Connect (a,b,c) - Function call) -----------------> connection.dll --------------------------> ADOMD.net ---------------------> AS 2000 (on Remote m/c)

The calling mechanism should be as described here.

In both cases "connection.dll" is the very same, yet it is not able to connect to the AS 2000 server if called from "libReader.dll"; on the other hand, it gives no error if called from the Win32 app. Could you please let me know if I am missing anything, or whether you would like more input?

Thanks
neoonwheels

SQL Server 2005 has shipped!

This morning at 9 AM in shiproom, we did official signoffs for SQL Server 2005!

SQL Server 2005 is in the bag!


Reply:
Well Done!

------------------------------------
Reply:
That is very exciting news! :)

------------------------------------
Reply:
Congratulations !
I am eager to get it going.

Does it require uninstalling the June CTP / Sept CTP before installation?

thanks,
Nitesh

------------------------------------
Reply:
Good morning, with whom do I have the pleasure?
Leonardo Rodriguez wrote:

------------------------------------

Pivot table published in a web page : Error (Data mining)

Hello,

I've created an OLAP cube in SQL Server 2005 Analysis Services. I can browse it successfully, and I was able to use a pivot table in Excel 2000 with no problems. But when I try to save the result as a web page in order to publish it as an HTML page, I get this error:

Unable to connect the PivotTable list to the data source "DataBase". For more information on the data source, contact the creator of the file.
0x80004005: "Error (Data mining): Either the user, "My profile", does not have permission to access the referenced mining model, "DataBase", or the object does not exist."


I used to publish the results in a web page using OLAP services for SQL Server 2000 with no problems.

Any help is appreciated.

Regards,
Fadoua

calculations scripts for aggregating on a dimension with several hierarchies

Hi everybody,
I am using the September SQL Server 2005 CTP version.
I have a cube with five dimensions and several measures. I want to aggregate all the measures except one through the "sum" function along all the dimensions.
On the contrary I have a measure (called [Valore]) which I want to aggregate with the Sum function along all the dimensions except one dimension called [DomMese]. Along all the hierarchies of this dimension, in fact, I would like to aggregate that measure using the firstchild (or, even, using "average").
I tried to add in the Calculations script the following piece of code, using the "Scope" statement:

SCOPE ([Measures].[Valore]);

  /*The "DomMese" dimension has two hierarchies: [Hierarchy] and [Hierarchy1]*/
   SCOPE([DomMese].[Hierarchy].allmembers);

 /*"Hierarchy" hierarchy has three levels, from highest to lowest: [Livello03]->[Livello02]->[Dom Mese], where the [Dom Mese] is the attribute with Key Usage*/
        SCOPE     ([DomMese].[Hierarchy].[Livello02].members);

            this =      [DomMese].[Hierarchy].CurrentMember.firstchild;    

        END SCOPE;

        SCOPE     ([DomMese].[Hierarchy].[Livello03].members);

            this =      [DomMese].[Hierarchy].CurrentMember.firstchild; 

        END SCOPE;

    END SCOPE;

   SCOPE ([DomMese].[Hierarchy1].allmembers);

/*"Hierarchy1" hierarchy has two levels, from highest to lowest: [Livello01]-> [Dom Mese], where the [Dom Mese] is the attribute with Key Usage*/
        SCOPE ([DomMese].[Hierarchy1].[Livello01].members);

            this = [DomMese].[Hierarchy1].CurrentMember.firstchild;

        END SCOPE;

   END SCOPE;

END SCOPE

 

The results I see after processing are not correct: I get the right aggregated values only for the second hierarchy ([Hierarchy1]), while I lose the right results for the first hierarchy ([Hierarchy]), which ends up aggregated with the Sum function.

If I put only the following code in the calculation script (cutting the part related to [Hierarchy1]), I get the right results for the first hierarchy [Hierarchy], but completely wrong results for the other hierarchy (neither FirstChild nor Sum, but wrong, repeated aggregated values and some empty aggregated values):

SCOPE ([Measures].[Valore]);

    SCOPE ([DomMese].[Hierarchy].ALLMEMBERS);

        SCOPE ([DomMese].[Hierarchy].[Livello02].MEMBERS);
            THIS = [DomMese].[Hierarchy].CurrentMember.FirstChild;
        END SCOPE;

        SCOPE ([DomMese].[Hierarchy].[Livello03].MEMBERS);
            THIS = [DomMese].[Hierarchy].CurrentMember.FirstChild;
        END SCOPE;

    END SCOPE;

END SCOPE;

It seems I am not going about aggregating across several hierarchies the right way. Can anybody help me?

I would appreciate it very much. Thank you.

 

Excel Pivot Tbl-AS2005

Hi,

I have a cube that I created in AS2005, with a hierarchy and a drillthrough action defined.  When I create a pivot table based on that cube, I cannot drill down to the next level.  My hierarchy is based on the Facility dimension: Facility State is the root level, and Facility Zip is the next level. Can anyone lend a hand?
thanks!

aggregations on cube dimensions

Hi everybody.
I am trying to set up a cube (in SQL Server 2005 Sept CTP) where one of the measures has to be aggregated through the "Min" function.
If I set the "AggregationFunction" property of that measure in the cube to "Min" and process it, it just collects the maximum value from the records of my fact table for each cell of the cube, and then aggregates all the hierarchies of all the dimensions with the Min function, but this is not what I need.
What I actually need is to sum the records at the atomic level, and still aggregate with Sum along all the hierarchies of all the dimensions except one, where I would like to aggregate through Min on all of that dimension's hierarchies.
What is the best way to achieve such a result? Should I use the calculation scripts? If so, could someone write me a sample? I could not find any examples that help me achieve this goal...
Thank you very much.
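For what it's worth, one possible direction, along the lines of the SCOPE scripts earlier in this thread, is a calculation script that reassigns the rollup for that one dimension only. This is only a sketch; [MyDim], [MyHierarchy], [Leaf] and [Measures].[MyMeasure] are placeholder names, and the measure would keep its regular Sum AggregationFunction in the cube:

```mdx
/* Sketch only: override the rollup of a Sum measure with Min
   along one hypothetical dimension, leaving the other dimensions on Sum. */
SCOPE ([Measures].[MyMeasure]);

    /* All members of the hierarchy except the leaves, which keep
       their summed fact-level values. */
    SCOPE ([MyDim].[MyHierarchy].MEMBERS
           - [MyDim].[MyHierarchy].[Leaf].MEMBERS);

        THIS = Min ([MyDim].[MyHierarchy].CurrentMember.Children,
                    [Measures].[MyMeasure]);

    END SCOPE;

END SCOPE;
```

Whether this interacts well across several hierarchies of the same dimension is exactly the open question in the thread above, so it would need testing on the Sept CTP.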

Using a cube as a data source

Hi all:

Is there a way in SSAS 2005 to create a cube/measure group from another cube/measure group ?

Thanks.

Suranjan

how to connect sql server 2005 cube using visual studio 2005

May I know how to connect to a SQL Server 2005 cube using Visual Studio 2005?
May I have some examples and source code to solve this problem?

Thanks a lot.

Analysis Services Project Template Failed installation... a "Project Template" issue

I installed SQL 2005 AS (Analysis Services project support) and its installation failed at the moment of adding VS2005 support; as a result it failed to properly add the project template resources into VS.  I removed and reinstalled VS2005 and SQL2005 various times (including all related resources and directories) trying to get the issue resolved, and I am still getting the same issue.

I am trying another approach now: how can I set up the SQL 2005 integration tools in VS2005?

When I try to create a new AS project it tells me that it can't find the ".pwproj" file and shows me the path where it is located, even though the file is actually where it is looking.  My guess is that the project template or integration package installation didn't complete, it is broken, and needed resources are not set appropriately.

Again, how can I rerun the SQL VS2005 integration tools?
 
If this is the wrong group to ask this question, can you tell me where I could get my answer?

Analysis services: Locking partitions

Table A in SQL Server 2005 is split into partitions, where the oldest partition is deleted when the new partition fills up. Data aggregation using Analysis Services is powerful, but when synchronizing against the database, source data from the deleted SQL Server partition/filegroup is removed and the aggregations in Analysis Services are updated.

Is there any way to avoid updating the data for certain partitions and to lock the data in those partitions in Analysis Services?


Referenced Table Dimension worked in June SQL Release but fails in Sept Release

I have a SQL table named PkgGroup that is related to the Item table; the Item table is in turn related to the MonthlyTrans fact table with the measures I want to aggregate.  I used the Dimension Usage tab of the Cube Designer to designate that the PkgGroup table is related by reference to the Item table.  In the June release this worked fine and my table processed correctly.  When I use the same Analysis Services project that had been created with the June release against the same SQL Server database that I had backed up and then restored with the installation of the Sept release, I get an error message that includes this line - "The attribute key cannot be found: Table: dbo_PkgGroup, Column: PkGrpNo, Value: 73087".  I have rechecked the tables, the relationships, and the underlying data, and it all seems to be in order.  Is the new release not working correctly, or is it something I'm doing wrong?

Sept CTP: Distinct count agg differs when using WHERE clause and Aggregate(set)

Our cube has 9 dimensions, and several measure groups.  The measure group involved in this issue, is a distinct count of customers.  One of the dimensions is a time dimension, and it is marked as such since we have a semi-additive measure in another measure group.  More on that later.

We've been using 2005 June CTP for our test environment, and recently installed the newer 2005 Sept CTP.  That's when our problems started.  Our client needs the ability to place several groups of members from one dimension on rows/columns.  To meet this requirement, our webclient generates MDX that uses the Aggregate function.  (See below)

When testing data correctness from the cube, we quickly found that the Aggregate function has changed from the June CTP to the Sept CTP, and this is also indicated in Mosha's blog. 

Our understanding has been that, given at most one member group from each dimension, the aggregation should be exactly the same using the Aggregate function as when putting the member set in the WHERE clause of the SELECT statement.  In the June CTP this is indeed the case.  In the Sept CTP, however, it is not. 

We know the correct query results, both from our source SQL database, as well as from the fact table for the cube.  Our findings (compared to known correct numbers) are summarized below:

SQL Server version    Aggregate query    WHERE query
CTP June              Correct            Correct
CTP Sept              WRONG!             Correct


It did not matter, for the Sept CTP, whether the Time dimension was marked as type=Time or type=Regular; the results were the same.

The cube project used to deploy the cube onto the Sept 2005 CTP was originally developed on June 2005 CTP.  Before deployment and processing it was opened and edited somewhat in the Sept 2005 CTP SQL Server Business Intelligence Development Studio.

If anyone has any insights, we'd be grateful.

-Christian

-----------------------------------------------
MDX Aggregate version:

WITH 

MEMBER [Products].[several products] AS 'Aggregate({[Products].prod1, [Products].prod2, etc.})'

MEMBER [Time].[much time] AS 'Aggregate({[Time].[Q1], [Time].[Q2], etc.})'

SELECT

{ [Measures].[number of unique customers] } ON COLUMNS,

{ {[Location].&[112], [Location].&[115]} * {[Products].[several products]} * {[Time].[much time]} * {[CustomerType].&[0]} } ON ROWS

FROM [Cube1]

-----------------------------------------------
MDX WHERE version:


SELECT

{ [Measures].[number of unique customers] } ON COLUMNS,

{ {[Location].&[112], [Location].&[115]} * {[CustomerType].&[0]} } ON ROWS

FROM [Cube1]
where
(
 {[Products].prod1, [Products].prod2, etc.},
 {[Time].[Q1], [Time].[Q2], etc.}
)

Issues with .dwproj file needs resolution

I am trying to create a new "Analysis Services / BI" project, and when I do it says that I can't since the "<Analysis Services...>.dwproj" file is not installed.  I searched for issues related to fixing this problem, and the recommendations are to reinstall.

I think reinstallation is rather nasty.  There must be a way to add the SQL application VS tools into VS2005 without spending all the time of an uninstall/reinstall.

So how can I install the MS-SQL VS2005 tools from within VS2005 without reinstalling the product?

My guess is that if I can get .dwproj installed (though I don't have a clue how to do it and haven't seen/found any documentation on it), I will be OK.

Ideas? Suggestions?

Deploying Excel Localcube

 

Hi friends,

 

Whenever I create a local cube and build an Excel report from it, and then send the report + the local cube to a client, I find that they often save the report + local cube to a different directory.

 

The problem is that Excel actually stores an absolute path that points to the cube. So when I've created the cube on my computer it could be c:\my.cub, but when the client has unzipped the report it could be in d:\reports\jan\my.cub. As a result, Excel often complains about not being able to find the cube.

 

What can I do about this ?

 

Thanks

Tom

Analysis Services 2000, deleting a cube

When I am in Analysis Manager and I try to delete a cube/dimension/database/measure by right-clicking and selecting 'Delete', Analysis Manager freezes up and mmc.exe uses 50 percent CPU (it stays like this forever until I end the process). 

I have tried reinstalling Analysis Services as well as SQL Server altogether, but I still get the same problem whenever I try to delete something.  Any ideas?

SSAS newbie/simple cube/dimension creation question

I am starting with the SSAS September CTP and am trying to build my first cube, but it is not working at all.
I have one table named FACTtable with four fields: CustCode, Country, TranDate and Amount.
I am trying to produce a cube that will sum Amount by the dimensions CustCode and Country, and also to have TranDate as a dimension (visually along the top) with the ability to automatically sum up to month/quarter/year etc.
There is only one table involved, so it is both the fact and the dimension table.
I get errors about logical keys; if I do not use TranDate it works fine.
The BOL are not very good as a tutorial, and I think my example here is a very simple one.  I must be missing a very obvious base understanding of the tool set in SS2005.
Can anyone advise how I should create my cube/dimensions through the wizards to achieve the desired result?  Are there any documents that explain this clearly with no assumption of previous SSAS knowledge?
I know there is a 5-day course for this in SS2000, and I am waiting for the SS2005 version of the course, but I cannot find the new course yet.

thanks in advance.

Sept CTP -- Interaction of calculated members in queries and cubes -- seems like a big regression

We frequently put calculated members and/or calculated cells in our cubes with solve orders greater than 1.  Our standard is that additive calculations are at 10, Share computations at 20 and Change/%change at 30.

In particular, we have a "market share" calculated member of a "companies" dimension that divides a company's data by the market total to arrive at a "share".  That Share is calculated at a solve order of 20 -- in the cube.

In our queries we frequently make local calculations at lower solve orders.  For example, we might calculate profit = sales - costs with a solve order of 0.  We would expect a query of (Share, Profit) to (1) calculate profit for the company and profit for the market and (2) divide company profit by market profit to get a market share.  (This was the behavior in AS2K.)  However, what we find is that we instead get (1) calculate (sales, share) and (costs, share) and then (2) subtract to get (profit, share).  Clearly an incorrect answer.
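As a sketch of the kind of query involved (cube, measure, and member names here are hypothetical; [Market Share] stands in for the calculation defined in the cube at solve order 20):

```mdx
WITH
  /* Query-local calculation at a low solve order */
  MEMBER [Measures].[Profit] AS
    '[Measures].[Sales] - [Measures].[Costs]', SOLVE_ORDER = 0
SELECT
  { [Measures].[Profit] } ON COLUMNS,
  /* [Market Share] is assumed defined in the cube with SOLVE_ORDER = 20 */
  { [Companies].[Market Share] } ON ROWS
FROM [MyCube]
```

The AS2K expectation is that the higher solve order wins at the intersection: Profit is computed for the company and for the market first, and the share ratio is taken last.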

Please tell me this is a bug that will be fixed!!

If not, the only way we can use SSAS is to either (1) Put ALL calculations in the queries or (2) put ALL calculations in the Cube.  But we can't mix them.

I think this is a show stopper for me.


Importance of the New SQL Server Windows Groups

Those of you who have installed SQL Server 2005 may have noticed that the installation creates several new Windows groups on the server.  Do not underestimate the importance of these groups.

I had been unable to configure Transactional Replication.  Everything worked fine until I tried to start the Distribution Agent.  The agent started just fine, but it reported the following error:



Date  10/19/2005 08:29:50 AM
Log  Job History (YKOADS3-CLGMMCADHOC-ADHOC_Pub-YKCLNSD-3)

Step ID  2
Server  YKOADS3
Job Name  YKOADS3-CLGMMCADHOC-ADHOC_Pub-YKCLNSD-3
Step Name  Run agent.
Duration  00:03:19
Sql Severity  0
Sql Message ID  0
Operator Emailed  
Operator Net sent  
Operator Paged  
Retries Attempted  0

Message
-XSTEPID 2
   -XSUBSYSTEM Distribution
   -XSERVER YKOADS3
   -XCMDLINE 0
   -XCancelEventHandle 000006A8
2005-10-19 12:33:04.568 Startup Delay: 4315 (msecs)
2005-10-19 12:33:08.896 Connecting to Distributor 'YKOADS3'
2005-10-19 12:33:09.021 Initializing
2005-10-19 12:33:09.037 Parameter values obtained from agent profile:
   -bcpbatchsize 2147473647
   -commitbatchsize 100
   -commitbatchthreshold 1000
   -historyverboselevel 1
   -keepalivemessageinterval 300
   -logintimeout 15
   -maxbcpthreads 1
   -maxdeliveredtransactions 0
   -pollinginterval 5000
   -querytimeout 1800
   -skiperrors
   -transactionsperhistory 100
2005-10-19 12:33:09.037 Connecting to Subscriber 'YKCLNSD'
2005-10-19 12:33:09.037 Agent message code 20084. The process could not connect to Subscriber 'YKCLNSD'.
2005-10-19 12:33:09.053 Category:NULL
Source:  Microsoft SQL Native Client
Number:  18456
Message: Login failed for user 'DM_MONTYNT\CLG_Distributor'.

 



I couldn't understand what I was doing wrong.  "DM_MONTYNT\CLG_Distributor" was a member of the "db_owner" role in both the "distribution" database and the Subscriber database.  (I am using a PUSH Subscription.)

I found a clue when I re-read the Books Online topics about Replication security.  "DM_MONTYNT\CLG_Distributor" also needed to be a member of the Publication Access List (PAL).  The problem was, the only way you can add a user account to the PAL is to (1) grant the account access to the Publisher database and (2) define the account on the Publisher server and the Distributor server.  (In my case, the Publisher and Distributor are the same server.)  Since I had not granted this account such access, I could not add this account to the PAL.

This created a problem, because I didn't want to grant "DM_MONTYNT\CLG_Distributor" access to the Publisher database.  I figured if Microsoft wanted me to grant the account for the Distribution Agent access to the Publisher database, they would have said so in the "Replication Agent Security Model" article in Books Online.  I guess I'm just naive.  ;-)

However, when I took a closer look at the PAL, I noticed that one of those new Windows groups ("SQLServer2005MSSQLUser$YKCLNSD$MSSQLSERVER") was already a member of the PAL.  "DM_MONTYNT\CLG_Distributor" was not currently a member of that group, so I decided to add it.  As soon as I added this account to this Windows group, replication worked.  It has continued to work ever since.

So if you're having security problems, take a look at these new Windows groups and see if they are tied into the feature you are working with.  They may be the key to unlocking the problem.


Reply:
Hi Ken,

1) It is not recommended to put a 'normal' account into the new Windows groups such as "SQLServer2005MSSQLUser$YKCLNSD$MSSQLSERVER" or "SQLServer2005SQLAgentUser$YKCLNSD$MSSQLSERVER". These groups are used to hold the SQL or SQL Agent service account, and they are in the sysadmin fixed server role. So in your case, you essentially assigned "DM_MONTYNT\CLG_Distributor" as an admin of the SQL Server. (Actually, it is more powerful than SQL sysadmin, because it has the same Windows permissions as the service account. See http://msdn2.microsoft.com/en-us/library/ms143504(en-US,SQL.90).aspx#Review_NT_rights for more information.)

2) For the PAL: all logins that are members of the sysadmin fixed server role or the db_owner fixed database role on the publication database can subscribe to a publication by default, without being explicitly added to the PAL. And logins in the sysadmin role are in the PAL by default (see http://msdn2.microsoft.com/en-us/library/ms151153(en-US,SQL.90).aspx). That is why replication works in your case.

3) You need to grant (and only grant) "DM_MONTYNT\CLG_Distributor" access to the Publisher database in order to add the account to the PAL. This gives the account minimal access to the publisher DB while still making replication work.

Please let me know if you have further questions.
-Peng

------------------------------------
Reply:

Thanks for the additional information on these groups. 

I now want to clarify the steps you had me take for the benefit of other people reading this message:
1) We are now granting the Distribution Agent account access to the published database
2) The account's access is limited to membership in the "public" group only
3) Once the account was granted access to the database, I was able to add it to the PAL
4) I removed the account from the Windows group
5) I restarted the Distribution Agent just to be sure there were no problems

So far so good.  However, I strongly recommend that Microsoft revise its configuration instructions.

Here's what the article currently says:

Distribution Agent for a Push Subscription
The Windows account under which the agent runs is used when making connections to the Distributor. This account must:

  • At minimum be a member of the db_owner fixed database role in the distribution database.
  • Be a member of the PAL.
  • Have read permissions on the snapshot share.
  • Have read permissions on the install directory of the OLE DB provider for the Subscriber if the subscription is for a non-SQL Server Subscriber.
  • The account used to connect to the Subscriber must at minimum be a member of the db_owner fixed database role in the subscription database (or have equivalent permissions if the subscription is for a non-SQL Server Subscriber).

Distribution Agent for a Pull Subscription
The Windows account under which the agent runs is used when making connections to the Subscriber. This account must at minimum be a member of the db_owner fixed database role in the subscription database.

The account used to connect to the Distributor must:

  • Be a member of the PAL.
  • Have read permissions on the snapshot share.

I would revise "Be a member of the PAL." by adding a note that you need only make the account a member of the "public" role in the published database in order to add the account to the PAL.  Even if this information exists elsewhere in Books Online, there should be a reference here just so people don't have to go crazy searching for it.


------------------------------------
Reply:
Thanks for the feedback. I will forward your suggestion to the technical writer so he can use it to improve BOL.

-Peng

------------------------------------
