
Posts Tagged ‘Cloud’

Throttling is inherent / inevitable for Cloud deployments September 3rd, 2014

Vinod Kumar

My good friend and mentor in many ways, Govind Kanshi, wrote about his learnings from working on the Cloud for ISVs, and he calls out a number of caveats. One of the hidden gems in that article is the concept of throttling.

We keep talking about this with our customers, and from time to time I get requests to help solve error messages developers run into as part of a new deployment to the cloud. I recently saw this error message from a customer:

Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.

They said they were working on a batch process and getting this error from time to time. My general tendency is to ask, "What have you done in your environments to solve this?" The common answer I get is: add a connection string property such as Connection Timeout=300 or some other high number.

Ahhh !!! Do you get it? Is there a problem? These are reasonable fixes when you are working with SQL Server on-premises, and they make a lot of sense there. But if you are working on Azure SQL DB (or any cloud vendor's database), then you need to understand the behaviour of throttling :).

Everything on the cloud has limits, and there are a few restrictions one needs to adhere to. Cloud dynamics is all about sharing resources: preventing runaway resource consumption by a single connection, and monetizing a shared, scalable pool of resources.

So the customer always asks, "What should I do now? What numbers should I be aware of?" In this particular case, I generally tell the customer:

  1. Try to make transactions smaller and process the work in batches instead of one big transaction for the whole batch.
  2. There are limits on how long a transaction can stay open (roughly 30 minutes), which doing the above helps you avoid.
  3. If you are doing a lot of TempDB interaction or DML within the same transaction boundary, there is a limit of roughly 2 GB on the size of a transaction. Doing point 1, we avoid that too.
  4. Irrespective of all the above, build retry logic into your application / DAL layer to handle this behaviour; a minimal sketch follows this list.
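
To make point 4 concrete, below is a minimal retry sketch in PowerShell using System.Data.SqlClient. It is illustrative only: the connection string and the dbo.ProcessNextBatch procedure are placeholders, and a production DAL would also inspect the SQL error number so that only transient / throttling errors are retried.

```powershell
# Minimal retry-with-back-off sketch; connection string and procedure are placeholders.
$connectionString = "Server=tcp:yourserver.database.windows.net,1433;Database=yourdb;User ID=youruser;Password=<password>;Encrypt=True;"
$maxRetries = 5

for ($attempt = 1; $attempt -le $maxRetries; $attempt++) {
    $connection = New-Object -TypeName System.Data.SqlClient.SqlConnection -ArgumentList $connectionString
    try {
        $connection.Open()
        $command = $connection.CreateCommand()
        $command.CommandText = "EXEC dbo.ProcessNextBatch"    # hypothetical proc that processes one small batch
        [void]$command.ExecuteNonQuery()
        break                                                 # success, stop retrying
    }
    catch {
        if ($attempt -eq $maxRetries) { throw }               # retries exhausted, surface the error
        Start-Sleep -Seconds ([int][math]::Pow(2, $attempt))  # exponential back-off: 2, 4, 8... seconds
    }
    finally {
        $connection.Dispose()
    }
}
```

The same pattern translates directly into whichever language your DAL is written in.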

Please read the WIKI page to understand more about these limits; they are worth noting. Also learn how the SQL engine applies throttling in the Azure world.

The basics don’t change irrespective of where you work in the cloud. It is important to understand them and be prepared in our code upfront. Have you built retry logic in your DAL anytime? Share your experience with us.


Azure Bytes: Creating an Azure DB using Quick Create September 1st, 2014

Vinod Kumar

I have been wanting to create this content concept for a long time. I am going to call this the “Azure Bits-N-Bytes” series. The idea is to create short videos on Azure-related topics (ideally less than 5 minutes each) and share them.

In this video I talk about how to create a SQL Azure DB using the portal’s Quick Create option.

As you can see, this is the simplest option available for creating a SQL DB on Azure. As easy as it is, there are a number of options we will need to play around with after this to access the DB. We will discuss those in future posts.
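
For those who prefer scripting over the portal, the same database can also be provisioned from PowerShell. This is only a hedged sketch using the Azure Service Management cmdlets available at the time; the login, password, location, IP addresses and database name are placeholders.

```powershell
# Create a logical server, open the firewall for a client IP, then create the database.
$server = New-AzureSqlDatabaseServer -AdministratorLogin "dbadmin" `
              -AdministratorLoginPassword "P@ssw0rd!Demo" -Location "Southeast Asia"

New-AzureSqlDatabaseServerFirewallRule -ServerName $server.ServerName -RuleName "HomeOffice" `
    -StartIpAddress "203.0.113.10" -EndIpAddress "203.0.113.10"

New-AzureSqlDatabase -ServerName $server.ServerName -DatabaseName "DemoDB" -Edition "Web" -MaxSizeGB 1
```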

Do let me know if you are interested in these types of quick videos in the future too. Based on your feedback, we will increase the frequency of such posts.


Getting started with PowerShell and Azure August 21st, 2014

Vinod Kumar

I am a big time supporter of PowerShell and automation when it comes to working with Azure. In the recent past, I have seen SQL developers ask me how they can start playing with PowerShell for Azure. Though the concepts of PowerShell are awesome and I use them with SQL Server, it is always difficult for them to get set up on their machines. Often, the question is: where do I start? This blog will get you started with installing and configuring your machine with PowerShell and Azure.

We can install the Azure PowerShell modules by running the Microsoft Web Platform Installer. I have initiated the same and gone through the initial wizard.

We can see the installation progress, and once done, we are presented with information about the various components that get installed as part of this.

Now, to use PowerShell with our Azure subscription, we need to link our account. If you are using an organizational account, then use the Add-AzureAccount command from the Windows Azure PowerShell prompt. This is useful ONLY when we are using Azure AD for our account. If you are not using Azure AD, then move to the next step.

In most cases, we will have a standard Live account associated with our subscription. In this case, we need to download the certificate. To do this, invoke the Get-AzurePublishSettingsFile command from the PowerShell command prompt.

We will be prompted with a save dialog; save this certificate file in a secure location. The next step is to import it so that we can start using PowerShell from our local machine to work with our subscription remotely.

For this we invoke Import-AzurePublishSettingsFile "<path of the certificate file>".

Once this is done, we are connected and we get a confirmation of the current subscription being used.

We can quickly get details about our current subscription using the Get-AzureSubscription command, which outlines some of this information.
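
Putting the steps together, the whole flow looks like this (the path to the .publishsettings file is just a placeholder):

```powershell
# Organizational (Azure AD) accounts can sign in directly:
# Add-AzureAccount

# Live account / certificate-based subscriptions: download and import the publish settings file.
Get-AzurePublishSettingsFile
Import-AzurePublishSettingsFile "C:\Secure\MySubscription.publishsettings"

# Confirm which subscription the session is now bound to.
Get-AzureSubscription
```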

Now we are all set and getting started with PowerShell and Azure. In future posts we will use this configuration as the starting point to play around with creating new VMs, stopping VMs, deleting VMs, creating other services and much more. Stay tuned !!!


Moving data to Cloud–Considerations and thoughts August 20th, 2014

Vinod Kumar

Being a data person and meeting customers almost every single day on considerations for moving to Azure (Cloud), I take the opportunity to demystify a lot of basic concepts with them. Post the session, it is about thinking in the right direction while taking data to the cloud. This blog reflects lessons learned, best practices, approaches and understanding from real-world experiences. These are not written-in-stone recommendations, but they are the approaches I highly recommend customers set their minds on while taking data to the cloud.

Know your workload

The basic conversation here is to identify if the application is a “Green Field” project or an “Existing Application”. In a “Green Field” project, we will be exposing data via a service for the first time. Like a literal green field, we are dealing with an undeveloped environment and don’t have any expectations, limitations, dependencies or other constraints that would result from having an existing service.

For existing applications, services may reside on our on-premises servers, a partner-hosted environment or third-party cloud environments. In such scenarios, there may be a need to move, scale, or provide a redundant environment for these services. When such applications come up, I ask whether the end user is open to exposing just a subset of the existing data as a new service on the cloud.

When working on a move to the cloud, I generally ask whether the customer wants to migrate their current deployment as-is, or treat the migration of data / services as an opportunity to enhance them, providing a superset of the functionality found in the original application.

If you already have an existing application, but it is reaching the limits of scale and scaling it further is difficult or impossible due to technology or financial constraints, one option is to use the cloud as a “scale layer”. In this scenario, cloud infrastructure can be put in front of your existing data services to provide significant scale. This is yet another workload to look at.

Exposing your data

There can be different levels of motivation when exposing your data via APIs. I generally quiz customers around this: is it a compliance requirement, is it self-motivated so that others can integrate, or is it a requirement from consumers? Based on this, the discussion decides whether it is a need-to-have, a want-to-have or a nice-to-have.

Irrespective of how the data gets exposed by applications, there is some further analysis we do; these are the typical queries we ask:

  1. Data Location: Are there constraints on where the data can be stored? Is geography a constraint?
  2. Data Type and Format: Are there any standards which mandate that data be transmitted in XML, JSON, binary, text, OData, DOM, Zip, image formats etc.?
  3. Sharing: Is there a specific API to be implemented, like Web API, WCF, Web Services, FTP, SAPI etc.?
  4. Consumption: Are there requirements to retrieve the data in its entirety, or to query the data and retrieve only a subset? Is queryability required, and do we need to certify the same?
  5. Commercial Use: It is critical to understand whether internal systems consume the data or whether it is available for commercial use. Is there a need to monetize access to the data and the services delivered?
  6. Data Discoverability: Is there a need to expose the data and advertise it somehow? Should it integrate with gateways, government services or third-party data services like Azure DataMarket?

Know your Data

There are multiple technologies which can be used to host data, each with their own pros, cons, and pricing. Data size is relevant, as it can be a key factor when determining which technology should be used to host the data. In addition to understanding the current data size, it’s important to also understand the data growth rate. Evaluating the growth rate lets you extrapolate storage needs in future years and helps scope the technology used; for example, a 100 GB data set growing at 20% a year will cross roughly 170 GB within three years.

Know how often the data gets updated. If the customer wants to expose data to the public, then updates between the source data and the data exposed via services may be in sync or out of sync.

Putting a copy of your data into the cloud requires the actual transfer of data from your location to the datacenter of the cloud hosting provider. These transfers take time and typically have associated bandwidth charges that should be considered, both for the initial transfer of the current state of your data set and for the periodic refreshes of that data.

Sharding is something that most cloud storage mechanisms support. When looking to move large data sets to the cloud, consider sharding and think about the optimal way to partition the data for query performance. Examples of different types of sharding include sharding by region, state, time period (month, year etc.), postal code, customer name etc.; a small sketch follows below.
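
As an illustration of region-based sharding, here is a tiny sketch of a shard map that routes a request to the right database. The server and database names are made up; a real deployment would keep this map in configuration or a dedicated shard-map store.

```powershell
# Illustrative shard map: each region maps to its own database.
$shardMap = @{
    "APAC" = "Server=tcp:apac-srv.database.windows.net;Database=SalesAPAC;"
    "EMEA" = "Server=tcp:emea-srv.database.windows.net;Database=SalesEMEA;"
    "AMER" = "Server=tcp:amer-srv.database.windows.net;Database=SalesAMER;"
}

function Get-ShardConnectionString {
    param([Parameter(Mandatory)][string]$Region)
    if (-not $shardMap.ContainsKey($Region)) {
        throw "No shard registered for region '$Region'"
    }
    return $shardMap[$Region]
}

# Route an APAC request to the APAC shard.
Get-ShardConnectionString -Region "APAC"
```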

Query patterns are yet another dimension we need to keep in mind. Since you pay based on bandwidth utilization, it is critical to understand the query patterns and how queries access the data. Since we are paying for the data usage, it is important to understand the amount of data we transfer in every request, the number of columns queried etc.

As we build our applications, it is important to understand if there are any export requirements for other systems to consume. In a hybrid scenario, we will need to run queries that export files to the local file system, and those files must then be transported and imported into a cloud store. Depending on the cloud technology being used for storage, the import may be trivial, or it may require custom code or scripts to export the data from the source and import it into the cloud.

Are you in the data selling business?

There have been very few customers who are in a B2C scenario, but those who are in this business need to understand that there are new costs to handle while serving data as a service. It is no longer a freebie. Even though cloud computing can greatly reduce costs, there are still costs involved to host, load and support a data service.

As these costs add up, companies look at a minimum of cost recovery rather than profit in the first run. They look at optimizing later, making profit on volumes rather than on single transactions. I have seen customers adopt different pricing strategies for a Data as a Service or Software as a Service offering. There is no one right way or the other:

  1. Tiered Pricing – like mobile plans, you charge standard transaction rates per month, time, data etc.
  2. Pay-as-you-go Pricing – something borrowed from how any cloud vendor charges.
  3. Peak and Non-Peak Pricing – we can have different pricing for different times of the day.
  4. One-time Usage Subscription – something like trial pricing.

There are many such mechanisms customers adopt in their strategy; these are just representative methods one might use.

Conclusion

I seriously hope I have planted some thoughts in your mind as you migrate your workload to the cloud or enable your data on the cloud. The cloud is no different from what you do on-premises; we just need to be doubly careful and take a few extra steps as we enable our customers. In future posts, there will be more thoughts that I will try to bring out.


Creating Excel Interactive View July 24th, 2014

Vinod Kumar

I have been wanting to write on this topic for ages but seem to have missed out for one reason or the other. How many times in your life have you seen a web page with a bunch of tables that are just boring to read? The numbers or tables sometimes have sorting capability, but they lack a striking visualization, to say the least. So I am going to borrow a table from a Wikipedia page about Indian population. There are a number of tables, and the table of interest to me is the literacy rate. The rough table looks like:

State/UT Code India/State/UT Literate Persons (%) Males (%) Females (%)
01 Jammu and Kashmir 86.61 87.26 85.23
02 Himachal Pradesh 83.78 90.83 76.60
03 Punjab 76.6 81.48 71.34
04 Chandigarh 86.43 90.54 81.38

Well, this is as boring as it can ever get, even when pasted as-is on this blog. Now here is the trick we are going to do, called Excel Interactive View. As the name suggests, we are going to use the power of Excel to turn this mundane table into some fancy charts for analysis. It involves a couple of scripts that need to be added alongside the HTML table, and we are done. It is really as simple as that. So let me add the complete table with the script added. Just click on the button provided above to see the magic:

State/UT Code India/State/UT Literate Persons (%) Males (%) Females (%)
01 Jammu and Kashmir 86.61 87.26 85.23
02 Himachal Pradesh 83.78 90.83 76.60
03 Punjab 76.6 81.48 71.34
04 Chandigarh 86.43 90.54 81.38
05 Uttarakhand 79.63 88.33 70.70
06 Haryana 76.64 85.38 66.77
07 Delhi 86.34 91.03 80.93
08 Rajasthan 67.06 80.51 52.66
09 Uttar Pradesh 69.72 79.24 59.26
10 Bihar 63.82 73.39 53.33
11 Sikkim 82.20 87.29 76.43
12 Arunachal Pradesh 66.95 73.69 59.57
13 Nagaland 80.11 83.29 76.69
14 Manipur 79.85 86.49 73.17
15 Mizoram 91.58 93.72 89.40
16 Tripura 87.75 92.18 83.15
17 Meghalaya 75.48 77.17 73.78
18 Assam 73.18 78.81 67.27
19 West Bengal 77.08 82.67 71.16
20 Jharkhand 67.63 78.45 56.21
21 Odisha 72.9 82.40 64.36
22 Chhattisgarh 71.04 81.45 60.59
23 Madhya Pradesh 70.63 80.53 60.02
24 Gujarat 79.31 87.23 70.73
25 Daman and Diu 87.07 91.48 79.59
26 Dadra and Nagar Haveli 77.65 86.46 65.93
27 Maharashtra 83.2 89.82 75.48
28 Andhra Pradesh 67.66 75.56 59.74
29 Karnataka 75.60 82.85 68.13
30 Goa 87.40 92.81 81.84
31 Lakshadweep 92.28 96.11 88.25
32 Kerala 93.91 96.02 91.98
33 Tamil Nadu 80.33 86.81 73.86
34 Puducherry 86.55 92.12 81.22
35 Andaman and Nicobar Islands 86.27 90.11 81.84

So how cool is this Excel visualisation? I am sure you will want to build or use this capability in your webpages or internal sites in your organization too. I hope you learnt something really interesting.

If you want to learn more about using this feature with your datasets and web pages, read the documentation for Excel Interactive View.

PS: the data comes from Wikipedia and I have just used a snapshot to show the feature. So please don’t read too much into the data; look at the Excel view capabilities.
