Both BIDS (Business Intelligence Development Studio) and Data Tools, the BI development tools, cannot work with the 64-bit Oracle driver at design time. Why? Even if what you installed is 64-bit SQL Server, these two BI tools are still 32-bit applications. For this reason, when you develop an SSIS package in BIDS/DT, the package cannot recognize the 64-bit Oracle driver.
The solution is to install the 32-bit Oracle driver when you develop the SSIS packages. With the 32-bit driver, you can debug and run the package in BIDS/DT. Does that mean you still need the 32-bit Oracle driver when moving the package to other environments like QA/PD, etc.? No. When you use a SQL Agent job to wrap up the finished SSIS packages, it is highly recommended to use the 64-bit Oracle driver, because the SQL Agent service is a 64-bit application and works well with the 64-bit driver. Can a SQL Agent job work with the 32-bit Oracle driver? Yes, but you have to specifically set up your job steps to run in 32-bit mode. What is the downside of running in 32-bit mode? The process is restricted in how much of the server's memory it can use.
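As a concrete illustration, the bitness is decided by which runtime launches the package. On a SQL Server 2012 box (adjust the version folder "110" to match your install; the package path is just a placeholder), the two copies of DTExec.exe live side by side, and a SQL Agent job step exposes the same choice through its "Use 32 bit runtime" execution option:

    :: 64-bit runtime (the default for SQL Agent job steps)
    "C:\Program Files\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /F "C:\Packages\MyPackage.dtsx"

    :: 32-bit runtime (use this when only the 32-bit Oracle driver is installed)
    "C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /F "C:\Packages\MyPackage.dtsx"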
So, the conclusion is: in the development phase, you have to install the 32-bit Oracle driver. When the package is pushed to QA/PD, it is recommended to have the 64-bit driver in place, since your SQL Agent job works well with it.
Hope it helps!
As you know, the default port for an SSAS instance is TCP 2383 (2382 is the SQL Server Browser port used to locate named instances). This means that for any client to reach the cubes, both the outbound port on the client and the inbound port 2383 on the server must be open. Otherwise, the connectivity test cannot pass.
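(A quick sanity check, assuming a default instance on a hypothetical server "my-ssas-srv": if the port is open, telnet connects instead of timing out.)

    telnet my-ssas-srv 2383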
In some cases, e.g., my company, only the HTTP port (80) and HTTPS port (443) are open, and all other ports are closed by firewall rules for security reasons. To make sure end users can connect to the SSAS cube, a firewall-rule request is required, and it usually takes time (2~3 weeks). Is there a workaround? Can we make clients access the SSAS cube over port 80 or 443?
The short answer is YES, and this post talks about the solution. The key point is to copy the Analysis Services ISAPI files into an IIS folder and expose that website. When incoming requests hit the website endpoint (80/443), the HTTP handler uses the executable (MSMDPUMP.dll) to relay the requests to the SSAS instance.
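In rough outline it looks like this (a sketch only; the folder names assume a SQL Server 2012 default instance, and the MSDN thread linked below has the authoritative steps): copy the pump files, point them at your SSAS server, and register the folder as an IIS application.

    :: 1. copy the ISAPI pump files into a new IIS folder
    xcopy "C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\bin\isapi\*" "C:\inetpub\wwwroot\OLAP\" /E /I /Y

    :: 2. edit C:\inetpub\wwwroot\OLAP\msmdpump.ini so <ServerName> points to your SSAS instance

    :: 3. register the folder as an application, then map *.dll to msmdpump.dll in IIS Manager
    %windir%\system32\inetsrv\appcmd add app /site.name:"Default Web Site" /path:/OLAP /physicalPath:"C:\inetpub\wwwroot\OLAP"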
By default, the website uses port 80. To expose it as a secure endpoint (443), you need to request a certificate and bind it to the site.
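(If you go the 443 route, the binding itself can be added from the command line once the certificate is installed; assigning the certificate to the binding is easiest in IIS Manager.)

    %windir%\system32\inetsrv\appcmd set site /site.name:"Default Web Site" /+bindings.[protocol='https',bindingInformation='*:443:']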
For detailed steps, please follow this thread: http://msdn.microsoft.com/en-us/library/gg492140.aspx
Worth mentioning: if you don’t want to expose a URL like “http://my-web-srv01/OLAP/msmdpump.dll”, you can set “msmdpump.dll” as a default document in the website configuration; then you end up with the more elegant URL “http://my-web-srv01/”. Life gets much easier.
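Once the endpoint is up, clients simply use the HTTP(S) URL where they would normally put the server name. For example, the connection string below is a sketch; “my-web-srv01” and the catalog name are placeholders:

    Provider=MSOLAP; Data Source=http://my-web-srv01/OLAP/msmdpump.dll; Initial Catalog=MyCube;

In Excel or SSMS, you can paste the same URL into the server name box.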
Let’s say you have somehow already built out your big data platform: it could be the Hortonworks Sandbox, Hortonworks on Windows, the Cloudera sandbox, or HDInsight. Now, what is next?
Usually, you can challenge yourself to push data into this big data platform (HDFS) and fetch it back out. This is a good start for understanding the ecosystem.
Let’s get data in first. The easiest way is the Hadoop shell command “copyFromLocal”, which copies raw files from your local file system into HDFS (the Hadoop Distributed File System), as sketched below. The other option you have is “Sqoop”, which helps put data from relational databases into HDFS.
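A minimal sketch (the local file and HDFS paths are placeholders):

    # copy a raw file from the local file system into HDFS
    hadoop fs -copyFromLocal /tmp/sales.csv /user/hadoop/raw/sales.csv

    # confirm it landed
    hadoop fs -ls /user/hadoop/raw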
Sqoop tutorial: https://sqoop.apache.org/docs/1.4.1-incubating/SqoopUserGuide.html
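For the Sqoop route, an import looks roughly like this (the JDBC URL, credentials, and table name are all placeholders; see the tutorial above for the full set of options):

    sqoop import \
      --connect jdbc:mysql://db-host/salesdb \
      --username etl_user -P \
      --table orders \
      --target-dir /user/hadoop/raw/orders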
Now you have data in your big data system, specifically in HDFS. How do you present the data? Well, the only tool I know of in this market is Excel Power Query, a free add-in for Excel. Once you install this add-in, you have the option to connect to HDFS and fetch the data. For more details, please check out this post: http://joeydantoni.com/2013/08/15/power-query-and-hadoop-2/
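Under the covers, the Power Query HDFS connector talks to WebHDFS over HTTP, so a quick way to confirm the cluster is reachable before involving Excel is to hit the REST endpoint directly (the name-node host and path are placeholders; 50070 is the usual WebHDFS port):

    curl "http://namenode:50070/webhdfs/v1/user/hadoop/raw?op=LISTSTATUS"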
Stay tuned, I will be back with more big data discussions.