Document Map’s secret

The document map is one of my favorite features in SSRS, especially the nested document map.

In SSRS, the document map makes report navigation much easier. After you design the document map in a report, an “index” section appears in the left frame when you view the report, as shown in the example below.

[Image DM1: the document map pane in a rendered report]

If you only set up the document map as a simple bookmark, that is not what I want to talk about today, as it is really straightforward. My focus today is on the “nested” document map. First of all, what do I mean by “nested”? Looking at the example above, the Category (“Bikes”) as the first level holds the Subcategory (“Road Bikes”) inside its body; further, the Subcategory as a group embraces the Model (“Road-150”); and under the umbrella of the Model, there is a list of products. This is what I mean by “nested”. When it comes to implementation, how do we achieve this? I hope my draft picture below illustrates the idea. As the picture shows, the implementation in SSRS is to use a list/table that contains another list/table inside it, and to set a document map label on each group level in each container (list/table).
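Under the hood, each of those group levels simply carries a document map label. A trimmed, hypothetical RDL fragment is sketched below; the group and field names are illustrative, not taken from the example report:

<!-- outer list grouped on Category; its group carries the first-level label -->
<Group Name="CategoryGroup">
  <GroupExpressions>
    <GroupExpression>=Fields!Category.Value</GroupExpression>
  </GroupExpressions>
  <DocumentMapLabel>=Fields!Category.Value</DocumentMapLabel>
</Group>

<!-- the nested list/table inside repeats the same pattern on Subcategory, then Model -->
<Group Name="SubcategoryGroup">
  <GroupExpressions>
    <GroupExpression>=Fields!Subcategory.Value</GroupExpression>
  </GroupExpressions>
  <DocumentMapLabel>=Fields!Subcategory.Value</DocumentMapLabel>
</Group>

Because each inner container sits inside the body of the outer group, SSRS nests the labels in the document map automatically.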

[Image DM: draft sketch of the nested list/table design]

I hope I have touched on the right point and that you take something useful away.


Why Does BIDS Only Work with the 32-bit Oracle Client Driver?

Both BIDS (Business Intelligence Development Studio) and Data Tools, the BI development tools, cannot work with the 64-bit Oracle driver in the design phase. Why? As you may have noticed, even if what you installed is 64-bit SQL Server, these two BI tools are still 32-bit applications. For this reason, when you develop an SSIS package in BIDS/DT, the package cannot recognize the 64-bit Oracle driver.

The solution is to install the 32-bit Oracle driver when you develop the SSIS packages. With the 32-bit driver, you can debug and run the package in BIDS/DT. Does this mean that when moving the package to other environments like QA/PD you still need to use the 32-bit Oracle driver? No. When you use a SQL Agent job to wrap up the finished SSIS packages, it is highly recommended to use the 64-bit Oracle driver, because the SQL Agent service is a 64-bit application and works well with the 64-bit driver. Can a SQL Agent job work with the 32-bit Oracle driver? Yes, but you need to specifically set your jobs to run in 32-bit mode. What is the downside of running in 32-bit mode? The process is restricted in how much memory it can use on the server.
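If you do need 32-bit execution outside the designer, one option is to invoke the 32-bit DTExec directly, which is essentially what the “Use 32 bit runtime” option on a SQL Agent job step does. A minimal sketch; the folder assumes a default SQL Server 2012 installation, and the package path is a placeholder:

# Run an SSIS package in 32-bit mode by calling the 32-bit DTExec explicitly.
# Adjust the version folder (110 = SQL 2012) to match your SQL Server version.
& "C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /File "C:\Packages\MyPackage.dtsx"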

So, the conclusion is: in the development phase, you have to install the 32-bit Oracle driver. When the package is pushed to QA/PD, it is recommended to have the 64-bit driver in place, as your SQL Agent jobs work well with it.

Hope it helps!

HTTP/HTTPS Access to Your SSAS Cubes

As you know, the default port for an SSAS instance is 2383, which means that for clients to access the cubes, both the outbound port 2383 on the clients and the inbound port 2383 on the server must be open. Otherwise, the connectivity test cannot pass.
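A quick sketch of such a connectivity test from the client side (it assumes PowerShell 4.0 or later, and the server name is a placeholder):

# Check whether the SSAS port is reachable from this client.
Test-NetConnection -ComputerName "my-as-srv01" -Port 2383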

In some cases, e.g., at my company, we only open the http port (80) and the https port (443); other ports are closed by firewall rules out of security concerns. To make sure end users are able to connect to the SSAS cube, requesting a firewall rule change is required, but that usually takes time (2~3 weeks). Is there any workaround for this issue? Can we make the clients access the SSAS cube using port 80 or 443?

The short answer is YES, and this post talks about the solution. The key point is to copy the executable files into an IIS folder and expose that website. When incoming requests hit the website endpoint (80/443), the http handler will use the executable (MSMDPUMP.dll) to process the requests.

By default, the website uses port 80. In order to expose it as a secure endpoint (443), requesting a certificate is necessary.

For the detailed steps, please follow this thread: http://msdn.microsoft.com/en-us/library/gg492140.aspx

Nice to mention: if you don’t want to expose a URL like “http://my-web-srv01/OLAP/msmdpump.dll”, you can set up “msmdpump.dll” as a default page in the website configuration; then you end up with the more elegant URL “http://my-web-srv01/”. Life gets much easier.
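To sanity-check the HTTP endpoint from a client, you can point any Analysis Services connection string at the pump. A minimal PowerShell sketch; the catalog name is a placeholder, and it assumes the ADOMD.NET client library is installed on the client:

# Load the ADOMD.NET client library (ships with the SQL Server feature pack).
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient")

# Point the connection at the msmdpump endpoint instead of server:port.
$conn = New-Object Microsoft.AnalysisServices.AdomdClient.AdomdConnection
$conn.ConnectionString = "Data Source=http://my-web-srv01/OLAP/msmdpump.dll;Initial Catalog=YourCubeDatabase"
$conn.Open()
$conn.State   # prints 'Open' if the HTTP endpoint works
$conn.Close()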


Big Data Start: Batch In and Batch Out

Let’s say you have somehow already built out your big data platform; it could be the Hortonworks sandbox, Hortonworks on Windows, the Cloudera sandbox, or HDInsight. Now, what is next?

Usually, you can challenge yourself to push data into this big data platform (HDFS) and fetch it back out. This should be a good start for understanding this ecosystem.

Data in first. The easiest way is to use the Hadoop shell command “copyFromLocal”, which copies raw files from your local machine into HDFS (the Hadoop Distributed File System). The other option you have is “Sqoop”, which can pull data from relational databases into HDFS.
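For instance, a minimal copyFromLocal sketch (the file and directory names are placeholders):

# Copy a raw file from the local file system into HDFS, then verify it landed.
hadoop fs -copyFromLocal /tmp/sales.csv /user/hadoop/sales/
hadoop fs -ls /user/hadoop/sales/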

Sqoop tutorial: https://sqoop.apache.org/docs/1.4.1-incubating/SqoopUserGuide.html
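As a rough illustration, a minimal Sqoop import might look like the sketch below; the JDBC connection string, user name, table, and target directory are all placeholders, and it assumes the matching JDBC driver is on Sqoop’s classpath:

# Pull one relational table into HDFS via Sqoop (-P prompts for the password).
sqoop import \
  --connect "jdbc:sqlserver://my-sql-srv01;databaseName=AdventureWorks" \
  --username etl_user -P \
  --table DimProduct \
  --target-dir /user/hadoop/dimproduct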

Now you have data in your big data system, specifically in HDFS. How do you get this data out and present it? Well, the only tool in this market I know of is Excel Power Query, a free add-in for Excel. Once you install the add-in, you have the option to connect to HDFS to fetch the data. For more details, please check out this post: http://joeydantoni.com/2013/08/15/power-query-and-hadoop-2/

Stay tuned; I will be back with more big data discussions.

Speaking at SQL Saturday Dallas BA Edition

I spoke on Power BI at SQL Saturday Dallas BA Edition (http://www.sqlsaturday.com/schedule.aspx).

Before this one, I have spoken at or attended more than five SQL Saturdays in different places: MSP, Madison, Richmond, WDC, and Dallas BI. I was surprised that the quality of the sessions and speakers was really high, maybe the highest among those I have attended. I did learn a lot from each session I listened to.

Here are some thoughts:

1. Cloud, cloud, and cloud. The cloud will be the future for all of IT.

2. Big data is no longer just a buzzword; we can now do something practical with it.

3. Empowering your end users is the way to go for IT. All the ad-hoc reporting tools are ready and easy for data analysts to pick up quickly.


Automation Scripts for Your Tabular Cube Deployment

If time permits, I would love to put all deployment processes into an automated mode.

A number of reasons support us (BI developers, and maybe all IT professionals) in doing so. Saving time is the first one. Think about deploying SSRS reports, cubes, and so on: deployment scripts can make the deployment just one click away. When the deployment goes to many environments, this benefit becomes even more outstanding. In addition, saving money! Why? The license fee for tools like BIDS is for developers, not for release professionals. If there is a dedicated person controlling the release and we have the scripts ready for him, then we can save some bucks on buying BIDS/SSDT for the release team, and they do not even need to spend learning time on tools like Visual Studio or BIDS. Further, by making a UI tool to wrap up your scripts and granting the permission to team tech leads, your employer can even save the money of hiring a release person, or greatly expand the release team’s bandwidth.

Knowing this drives me, each time, to find a way to make my deployment process as automatic as possible. Here is a quick tip on how to make SQL 2012 tabular cube deployment automatic. In the age of SQL 2005/2008/2008 R2, we had “ascmd.exe”. Now it is gone, and what we have is a more powerful tool: PowerShell and the “Invoke-ASCmd” cmdlet.

The syntax is quite simple and changes little compared with ascmd. My basic deployment script looks like the one below.

# Requires the SQL Server PowerShell module that ships Invoke-ASCmd (sqlps / SQLASCMDLETS).
Invoke-ASCmd -Server "…" `          # <– your AS server name
             -InputFile "….." `     # <– use the Cube Deployment Wizard to generate this file
             -TraceFile "….."       # <– outputs the results
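By the way, if you also want to generate the XMLA input file without clicking through the wizard, the Analysis Services Deployment Wizard has a command-line mode. A minimal sketch; the paths are placeholders and assume a default SQL 2012 installation, /o writes the XMLA script, and /d tells it not to connect to the target server:

# Turn the .asdatabase file produced by the build into a deployment XMLA script.
& "C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe" `
    "C:\Build\MyTabularModel.asdatabase" /o:"C:\Build\MyTabularModel.xmla" /d

Feed the generated .xmla file to Invoke-ASCmd above, and the whole deployment becomes scriptable end to end.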

Try this out, buddy.