The Microsoft Azure team recently announced significant (up to 72%) discounts for customers willing to make one- to three-year reservations. Reserved Instances (RI) are not new, of course – Amazon Web Services (AWS) has had RI for a long time. But are there differences in how the Azure team rolled out RI?
For example: When does it really make sense to use an RI? Can RI discounts be combined with unique offers like Azure Hybrid Benefit? Can customers cancel a reservation or exchange reserved VM types? And what are some best practices for deciding between one-year and three-year reservations?
We tried to answer many of these questions in the slide deck below, which we prepared for an internal briefing.
Hope this helps, and please let us know if you have any additional questions in the comments below!
Specifically, this application was designed to help analysts get personalized recommendations (based on their own preference settings and ratings provided by their co-workers) for the stories they need to analyze as part of their daily work.
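The recommendation logic can be sketched as a simple blend of an analyst's topic preferences with co-worker ratings. This is a minimal illustration, not the application's actual algorithm; the function name, topic keys, and scoring formula are all made up:

```python
def recommend(preferences, coworker_ratings, top_n=3):
    """Rank stories by blending the analyst's topic preferences with
    average co-worker ratings (illustrative model only)."""
    scores = {}
    for story, (topic, ratings) in coworker_ratings.items():
        avg_rating = sum(ratings) / len(ratings) if ratings else 0.0
        # Weight the co-worker consensus by how much the analyst
        # cares about this story's topic (default weight 0).
        scores[story] = preferences.get(topic, 0.0) * avg_rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

In practice, a real system would also handle cold-start stories and normalize for raters who score everything high, but the core idea is the same: personal preference weights applied to a social signal.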
The demo included in this video was part of our Ignite talk on cloud innovation with Azure Government. We use CNTK for an image detection problem: identifying objects inside a refrigerator. Image detection is a harder class of problem than image classification, because detection goes beyond classification to include localization of the object(s) within an image. That added difficulty is why we dropped down to a deep learning library. (Earlier in the presentation, Steve Michelotti showed the use of the Cognitive Services API for image classification.)
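To make the classification-vs.-detection distinction concrete, here is a minimal sketch of the two output shapes. The labels, confidence values, and bounding boxes below are invented for illustration; they are not CNTK output:

```python
# Classification answers "what is in this image?" with a single label.
classification = {"label": "milk carton", "confidence": 0.92}

# Detection additionally answers "where?" with one bounding box
# (x, y, width, height) per object found in the image.
detections = [
    {"label": "milk carton", "confidence": 0.91, "box": (34, 20, 110, 180)},
    {"label": "ketchup",     "confidence": 0.54, "box": (150, 60, 40, 95)},
]

def confident_objects(detections, threshold=0.8):
    # Keep only detections above a confidence threshold, a typical
    # post-processing step for detection models.
    return [d["label"] for d in detections if d["confidence"] >= threshold]
```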
In a nutshell, we took the Marvel Universe Social Database and loaded it in Azure Cosmos DB as a graph database. Then we built a simple web page that invoked Gremlin queries against Cosmos DB.
The key theme of this demo is the ease with which you can create a globally distributed database that can support low latency queries against the Marvel Universe graph database. In the context of AzureGov (as shown below), we can seamlessly replicate the data across the three AzureGov regions by clicking on these regions within the Azure portal.
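As a rough sketch of what those Gremlin queries can look like, here are two traversals built as strings, which is how Cosmos DB's Gremlin endpoint accepts them. The `character` vertex label, `name` property, and `appearedWith` edge label are assumptions for illustration, not the demo's actual schema:

```python
def character_query(name):
    # Find a character vertex by its name property.
    return f"g.V().hasLabel('character').has('name', '{name}')"

def coappearance_query(name, limit=10):
    # Characters who appear alongside the given character,
    # traversing the (hypothetical) 'appearedWith' edges.
    return (f"g.V().hasLabel('character').has('name', '{name}')"
            f".both('appearedWith').dedup().limit({limit}).values('name')")
```

Strings like these could then be submitted to Cosmos DB with a Gremlin client such as the `gremlin_python` driver's `Client.submit`.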
At our second session for Microsoft Ignite, Jason McNutt and I discussed Azure Resource Manager (ARM) and Compliance. We showed attendees how to develop ARM templates that are compliant out of the box, with security standards such as FISMA and FedRAMP. Additionally, we went over how to automatically generate security control documentation based on ARM tags and open-source libraries like OpenControl.
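One way to picture the tag-driven documentation step is a small script that walks an ARM template's resources and emits a control matrix from their tags. The tag names `controlId` and `controlOrigin` are hypothetical, not the exact convention used in the talk:

```python
def control_matrix(template):
    """Emit a Markdown table of security-control tags found on the
    resources of an ARM template (parsed into a dict). Illustrative
    sketch only; real tooling like OpenControl is far richer."""
    rows = ["| Resource | Control | Origin |", "|---|---|---|"]
    for resource in template.get("resources", []):
        tags = resource.get("tags", {})
        if "controlId" in tags:
            # Only resources explicitly tagged with a control appear
            # in the generated documentation.
            rows.append(f"| {resource['name']} | {tags['controlId']} "
                        f"| {tags.get('controlOrigin', 'n/a')} |")
    return "\n".join(rows)
```

The appeal of this approach is that the compliance documentation is regenerated from the same template that deploys the infrastructure, so the two cannot drift apart.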
Below is a short 15-minute video summarizing our Secure DevOps with ARM presentation:
Steve Michelotti and I presented a session on AzureGov last week at Microsoft Ignite 2017 in Orlando. It focused on demonstrating the innovative capabilities in AzureGov that are specifically designed to help government agencies with their mission. We dedicated about 80% of the session to live demos.
Steve started out with a brief description of AzureGov and how to get started, along with some recent news announcements, including API Management and Key Vault. He then quickly transitioned into demos of Cognitive Services, Azure IoT, and Power BI. I conducted two demos: the Cosmos DB graph database, and CNTK deep learning on an N-Series GPU machine.
Please watch the video below and let us know if you have any questions.
In an earlier blog post, we talked about Excel as a custom calculation engine. In a nutshell, a developer or power user can author the calculation logic inside an Excel workbook and then execute the workbook programmatically via either Excel Services or HPC Services for Excel. You can read about this approach in detail in our MSDN article. This approach has been used successfully at large scale by one of our customers for many years now.
Lately though, we’ve been thinking about Jupyter Notebooks as another potential option for building custom calculation engines.
But before we make the case, let’s review some background information on Jupyter Notebooks. Read More…
At the Microsoft BUILD 2017 Day One keynote, Harry Shum announced the ability to customize the vision API. In the past, the cognitive vision API came with a pre-trained model. That meant that as a user, you could upload a picture and have the pre-trained model analyze it. You can expect to have your image classified based on the 2,000+ (and constantly growing) categories that the model is trained on. You can also get information such as tags based on the image, detect human faces, recognize hand-written text inside the image, etc.
But what if you wanted to work with images pertinent to your specific business domain? And what if those images fall outside of the 2,000 pre-trained categories? This is where the custom vision API comes in. With the custom vision API, you can train the model on your own images in just four steps: Read More…
Azure Role-Based Access Control (RBAC) offers a powerful way to grant permissions based on the principle of “least privilege.” In this short video, we extend the idea of Azure RBAC to implement JIT (just-in-time) permission control. We think a JIT model can be useful for the following reasons:
1) Ability to balance the desire for “least privilege” with the cost of managing an exploding number of fine-grained permission rules (hundreds of permission types, combined with hundreds of resources).
2) Allow coarse-grained access (typically DevOps teams need access to multiple services) that is “context aware” (permission is granted during the context of a task).
Of course, JIT can only be successful if it's accompanied by smart automation, so that users have instant access to the permissions they need, when they need them.
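The JIT idea above can be sketched as a time-boxed role assignment: a grant that expires automatically unless renewed. This in-memory model is purely illustrative; a real implementation would create and remove actual Azure role assignments through the Azure management APIs:

```python
import time

class JitGrants:
    """Minimal sketch of just-in-time permission grants: each grant is
    a (user, role, scope) entry with an expiry timestamp."""

    def __init__(self):
        self._grants = {}  # (user, role, scope) -> expiry (epoch seconds)

    def grant(self, user, role, scope, duration_s):
        # Record a coarse-grained grant that lasts only for the
        # duration of the task at hand.
        self._grants[(user, role, scope)] = time.time() + duration_s

    def is_active(self, user, role, scope):
        # A grant is honored only while its expiry is in the future.
        expiry = self._grants.get((user, role, scope))
        return expiry is not None and time.time() < expiry
```

The same pattern maps naturally onto a scheduled job or Azure Function that revokes the underlying role assignment once the grant expires.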
Interested? Watch this 15-minute video that covers the concepts and includes a short demonstration of JIT with Azure RBAC.
Over the years, AIS has leveraged “Excel on Server” to enable power users to develop their own code.
Consider a common requirement to implement calculations/reports that adhere to the Financial Accounting Standards Board (FASB) standards. These types of reports are often large and complex. The calculations in the reports are specific to a geographical region, so a multi-national company needs to implement different versions of these calculations. Furthermore, over time these calculations have to be adjusted to comply with changing laws.
Traditionally, these calculations have been implemented using custom code, and as a result, suffer from the challenges outlined above, including the high cost of development and maintenance, requirements being lost in translation, the lack of traceability, and the lack of a robust mechanism for making a quick change to a calculation in response to a change in a standard. This is where the power of Excel on Server comes in.
As you may know, Excel on Server is available in two forms: Read More…