HTTP/HTTPS Access to Your SSAS Cubes

As you know, the default port for an SSAS instance is TCP 2383 (named instances are resolved through the SQL Browser service on 2382). That means for any client to reach the cubes, the port has to be open outbound on the client side and inbound on the server side; otherwise the connectivity test will fail.

In some cases, for example at my company, only the HTTP port (80) and HTTPS port (443) are open and all other ports are blocked by firewall rules for security reasons. To let end users connect to the SSAS cube, we would have to request a firewall change, which usually takes two to three weeks. Is there a workaround? Can clients access the SSAS cube over port 80 or 443 instead?

The short answer is YES, and this post walks through the solution. The key is to copy the SSAS HTTP executable files into an IIS folder and expose that folder as a web site. When requests hit the web site's endpoint (80/443), the HTTP handler uses the executable (MSMDPUMP.dll) to pass them on to the Analysis Services instance.

By default, the web site uses port 80. To expose it as a secure endpoint (443), you also need to request a certificate.
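
If you go the HTTPS route, the binding itself can be scripted. This is a minimal sketch, assuming the WebAdministration module on the IIS box and a certificate already installed in the machine store; the site name and the certificate subject filter below are hypothetical:

    # Run from an elevated PowerShell prompt on the IIS server
    Import-Module WebAdministration

    # Add a 443 binding to the site that hosts msmdpump.dll
    New-WebBinding -Name "Default Web Site" -Protocol https -Port 443

    # Attach the certificate to the binding (picks the first matching cert)
    $cert = Get-ChildItem Cert:\LocalMachine\My |
            Where-Object { $_.Subject -like "*my-web-srv01*" } |
            Select-Object -First 1
    New-Item -Path "IIS:\SslBindings\0.0.0.0!443" -Value $cert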

For detailed steps, please follow this article: http://msdn.microsoft.com/en-us/library/gg492140.aspx

Nice to mention: if you don't want to expose a URL like "http://my-web-srv01/OLAP/msmdpump.dll", you can set "msmdpump.dll" as a default page in the web site configuration. You then end up with the more elegant URL "http://my-web-srv01/". Life gets much easier.
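
To sanity-check the endpoint from a client machine, a quick ADOMD.NET test works well. This is a minimal sketch, assuming the ADOMD.NET client library is installed on the client; the URL, catalog, and cube names are just examples:

    # Connect to SSAS through the HTTP endpoint instead of the TCP port
    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices.AdomdClient")

    $connStr = "Data Source=http://my-web-srv01/OLAP/msmdpump.dll;Catalog=Adventure Works DW"
    $conn = New-Object -TypeName Microsoft.AnalysisServices.AdomdClient.AdomdConnection -ArgumentList $connStr
    $conn.Open()

    # Any trivial MDX proves the pump is forwarding requests to the cube
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = "SELECT [Measures].DefaultMember ON COLUMNS FROM [Adventure Works]"
    $cs = $cmd.ExecuteCellSet()
    "Pump works, first cell value: " + $cs.Cells[0].Value

    $conn.Close()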


Big Data Start: Batch In and Batch Out

Let's say you have somehow already built out your big data platform; it could be the Hortonworks sandbox, Hortonworks on Windows, the Cloudera sandbox, or HDInsight. Now, what is next?

Usually, a good first challenge is to push data into this big data platform (HDFS) and fetch it back out. That is a good way to start understanding the ecosystem.

Data in first. The easiest way is the Hadoop shell command "copyFromLocal", which copies raw files from your local file system into HDFS (the Hadoop Distributed File System). The other option is Sqoop, which moves data from relational databases into HDFS. A quick sketch of both follows the tutorial link below.

Sqoop tutorial: https://sqoop.apache.org/docs/1.4.1-incubating/SqoopUserGuide.html
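
For reference, here is roughly what the two options look like at the command line. This is a hedged sketch: the file paths, SQL Server name, database, table, and HDFS directories are all made-up examples, and the Sqoop import assumes the SQL Server JDBC driver is already on Sqoop's classpath.

    # Option 1: copy a raw file from the local disk into HDFS
    hadoop fs -copyFromLocal C:\data\sales.csv /user/derek/raw/sales.csv
    hadoop fs -ls /user/derek/raw            # verify the file landed

    # Option 2: pull a relational table into HDFS with Sqoop (-P prompts for the password)
    sqoop import `
      --connect "jdbc:sqlserver://my-sql-srv01;databaseName=AdventureWorksDW" `
      --username etl_user -P `
      --table FactInternetSales `
      --target-dir /user/derek/raw/factinternetsales `
      -m 1

On a Linux-based sandbox the same two commands run in bash; just use \ instead of the PowerShell backtick for line continuation.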

Now you have data in your big data system, specifically in HDFS. How do you present it back out? Well, in this market the only tool I know of is Excel Power Query, a free add-in for Excel. Once you install the add-in, you have the option to connect to HDFS and fetch the data. For more details, please check out this post: http://joeydantoni.com/2013/08/15/power-query-and-hadoop-2/

Stay tuned; I will be back with more big data discussions.

Speaking at SQL Saturday Dallas BA Edition

I spoke about Power BI at SQL Saturday Dallas BA Edition (http://www.sqlsaturday.com/schedule.aspx).

Before this one, I had spoken at or attended more than five SQL Saturdays in different places: MSP, Madison, Richmond, WDC, and Dallas BI. I was surprised by how high the quality of the sessions and speakers was, maybe the highest among those I have attended. I learned a lot from every session I sat in on.

Here are some thoughts:

1, Cloud, cloud, and cloud. Cloud will be the future for all of IT.

2, Big data is no longer just a buzzword; we can now practically do something with it.

3, Empowering your end users is the way to go for IT. The ad-hoc reporting tools are ready and easy for data analysts to pick up quickly.


Automation Scripts for Your Tabular Cube Deployment

If time permits, I would love to automate the entire deployment process.

A number of reasons push us (BI developers, and probably all IT professionals) to do so. Saving time is the first one. Think about deploying SSRS reports, cubes, and so on: deployment scripts can make the deployment just one click away, and the benefit grows when you deploy to many environments. In addition, saving money! Why? The license fee for tools like BIDS is meant for developers, not for release professionals. If there is a dedicated person controlling the release and we hand the scripts to that person, we save the cost of buying BIDS/SSDT for the release team, and they do not need to spend time learning tools like Visual Studio or BIDS. Further, by wrapping the scripts in a small UI and granting permission to team tech leads, your employer can even save the cost of hiring a dedicated release person, or greatly expand the release team's bandwidth.

Knowing this drives me, each time, to find a way to make my deployment process as automated as possible. Here is a quick tip on how to automate SQL Server 2012 Tabular cube deployment. In the SQL 2005/2008/2008 R2 era, we had "ascmd.exe". Now it is gone, and what we have instead is the more powerful combination of PowerShell and the "Invoke-ASCmd" cmdlet.

The syntax is quite simple and changes little compared with ascmd. My basic deployment script looks like the one below.

    # -Server    : your AS server name
    # -InputFile : the file generated by the cube Deployment Wizard
    # -TraceFile : where the execution results are written
    Invoke-ASCmd -Server "…" `
                 -InputFile "…" `
                 -TraceFile "…"
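
Putting it together, here is an end-to-end sketch I would start from. It is hedged: the instance name, file paths, and model name are hypothetical, it assumes the module that ships Invoke-ASCmd is available (SQLPS on SQL 2012, the SqlServer module on newer builds), and the optional first step uses the Deployment Wizard's command-line mode to produce the XMLA file.

    # Load the module that provides Invoke-ASCmd
    Import-Module SQLPS -DisableNameChecking   # newer builds: Import-Module SqlServer

    $asServer  = "my-as-srv01\TABULAR"                    # hypothetical tabular instance
    $xmlaFile  = "C:\Deploy\SalesModel.xmla"              # deployment script to execute
    $traceFile = "C:\Deploy\SalesModel_deploy_trace.xml"  # execution results land here

    # (Optional) generate the XMLA from the .asdatabase build output, silently:
    # Microsoft.AnalysisServices.Deployment.exe "C:\Build\SalesModel.asdatabase" /s:"C:\Deploy\wizard.log" /o:"C:\Deploy\SalesModel.xmla"

    # Run the deployment and keep the trace for the release log
    Invoke-ASCmd -Server $asServer -InputFile $xmlaFile -TraceFile $traceFile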

Try this out, buddy.


SQL Job Error: Failed to notify “operator” via email.

I believe DBAs and developers run into this issue quite often, even the senior ones. If you are one of them, don't worry; here is a proven approach to work your way out of it.

To resolve this issue, follow these steps (a scripted version follows the list):

#1, Set up Database Mail.

#2, In SSMS, open the SQL Server Agent properties and enable the Database Mail profile you just configured for the Agent.

#3, Create operators if you haven't already.

#4, In your jobs, set up the notification.
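
If you prefer scripting it instead of clicking through SSMS, here is a hedged PowerShell sketch. The operator name, email address, instance, and job name are hypothetical; it assumes Database Mail is already enabled and that Invoke-Sqlcmd (SQLPS/SqlServer module) is available.

    $server = "my-sql-srv01"   # hypothetical instance name

    # Step 3: create the operator (msdb.dbo.sp_add_operator)
    Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query "
        EXEC dbo.sp_add_operator @name = N'BI OnCall', @enabled = 1, @email_address = N'bi-oncall@mycompany.com';"

    # Step 4: have the job email that operator on failure (notify level 2 = on failure)
    Invoke-Sqlcmd -ServerInstance $server -Database msdb -Query "
        EXEC dbo.sp_update_job @job_name = N'Nightly Cube Refresh', @notify_level_email = 2, @notify_email_operator_name = N'BI OnCall';"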

Done. Have fun, guys. Happy holidays.

Derek Dai @TX

Christmas 2014


Reporting Services Troubleshooting #1: Failed to Deploy a Large RDL File

From now on, I will collect some interesting technical issues here. As you can see, I break them down into different buckets like SSRS, SSIS, SSAS, Power BI, PowerPivot, and so on.

This post starts the troubleshooting series. (By the way, you will see a "design tips" series too very soon.)

Well, this was an interesting issue because I had never encountered it before, even though I have worked with SSRS for more than 7 or 8 years. Alright, let's talk about the issue.

Have you seen this error before? See the screenshot below.

(Screenshot: the deployment error message)

RS complains "Maximum request length exceeded" when you deploy reports to the RS portal. Actually, I like this error: it is concise and specific, which is much better than errors that only give generic verbiage.

My gut feeling was that something was wrong with the file size, since the message says "Request length exceeded", so let's check the size of the RDL file. It is 5.8 MB, a bit bigger than normal files.

(Screenshot: the RDL file is 5.8 MB)

 

What is the default maximum size for uploading/deploying an RDL file? 4 MB. Yes, it is 4 MB. That is why I got the error.

How do you resolve it? Add an attribute to the file "web.config". The steps are below.

Browse to the RS installation folder ("C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\"); you should see the file "web.config". Open it and add (yes, "add": the attribute is not there by default) the attribute as below:

maxRequestLength="500000" on the "httpRuntime" element. (The exact size is up to you; 500000 works well in my case.) The value is in kilobytes.

(Screenshot: the updated web.config)
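
If you prefer to script the edit (handy when you manage several RS servers), a minimal sketch along these lines should work. It uses the same path and value as above and assumes the httpRuntime element already exists under system.web, which it normally does in the default RS web.config:

    # Run from an elevated prompt; make a backup first
    $config = "C:\Program Files\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\ReportServer\web.config"
    Copy-Item $config "$config.bak"

    [xml]$xml = Get-Content $config
    $node = $xml.SelectSingleNode("/configuration/system.web/httpRuntime")
    $node.SetAttribute("maxRequestLength", "500000")   # value is in KB
    $xml.Save($config)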

 

Now, someone might say you need to restart IIS for the change to take effect. Hmm, really? That was true for SQL 2000 and SQL 2005, but it is no longer the case from SQL 2008 onward, because Reporting Services stopped depending on IIS. In my case, SQL 2008 R2, the change took effect immediately.

My friends, take care, have fun.