Microsoft Ignite: Day 3

[Image: Ignite floors]

Read my summary of all Ignite days:

 

Welcome to the third installment of my Microsoft Ignite review!

And today, surprisingly, we’ll discuss the 3rd day of Ignite.

So these are the sessions I attended, and what I think of them:

Machine Learning Simplified (Erez Barak, Microsoft)

I had been looking forward to this session. As a developer and an architect, I never really had a chance – or a reason – to go into the Machine Learning space, and I knew I'd have to do it someday. I was hoping this session would provide a great introduction to the topic, and give me some background I could use in more advanced tasks.

And boy, did it deliver!

Microsoft is positioning its ML efforts for all skill levels, aiming to make its offering industry-leading, open, and trusted.

There are various tools available for ML work, and they are targeted to different types of users.

We discussed the four personas the tools are targeted for:

[Image: Personas]

As you can see, these personas are differentiated by their coding level and their ML level. Each persona has its own tool:

Daniel, who is an ML expert and a Python developer, can use Azure ML Notebooks to run Python algorithms on her models.

Samantha, who is also quite proficient with ML but has no coding experience, will use the ML Designer to drag and drop the activities and entities she would like to act upon, building the flow of the model.

Dean, who has neither ML knowledge nor development experience, will use AutoML to load the data and configure the required predictions on it.

And John, a developer with no ML experience, will use Pipelines.

Also on stage was Anita Klopfenstein, Little Caesars' CIO, who talked about the way they're using Azure ML to predict which type of pizza to prepare.

My take – Azure has a very impressive ML suite, but it's not for everyone.

I was most impressed by the AutoML feature. You literally just upload a bunch of data, tell Azure which column contains the values to be predicted, and voilà – you've got yourself an ML model. Really impressive, and ridiculously easy to use.

There are more features planned for Azure ML, and it looks like Azure is positioning itself as the top contender in this field.

 

Creating Amazing Web Apps with .NET Core (Daniel Roth, Microsoft)

My inner geek couldn't resist this session, and even though it was very code-oriented, I just had to see it.

So after a brief introduction to .NET Core, we delved into the new features of .NET Core 3.0.

First – performance has improved substantially.

[Image: .NET Core performance]

As you can see, compared to .NET Core 2.2, the performance gains are huge.

Next we talked about Blazor.

Blazor is a very interesting concept, allowing developers to build web front ends using C#. So instead of JavaScript, which not all backend developers are proficient in, the developer can use a language and runtime she's already familiar with.

Now, this might give you the chills, remembering ActiveX and Java applets, but this time it's different.

Blazor can be run in two modes:

WebAssembly, which is a widely adopted standard for running compiled code inside the browser,

and Blazor Server, which actually performs all the logic and UI rendering on the server, and just notifies the browser (using SignalR) about the UI changes to apply.
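To make this concrete, here's a minimal sketch of a Blazor component (my own illustration, not from the session – the component and member names are made up). The markup and the C# event-handling logic live together in a single .razor file:

```razor
@* Counter.razor – a hypothetical minimal component. The @code block is plain
   C#; no JavaScript is involved in handling the button click. *@
<h3>Counter</h3>

<p>Current count: @currentCount</p>

<button @onclick="IncrementCount">Click me</button>

@code {
    private int currentCount = 0;

    // Runs as C# – in WebAssembly or on the server, depending on the hosting mode.
    private void IncrementCount() => currentCount++;
}
```

The nice part is that the same component runs under both hosting modes; only the startup configuration differs.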

There was a short demo of Blazor, and it left me wondering whether the web UI world is heading in a new direction.

TBH, I don't see it gaining a lot of traction. The current UI frameworks are quite mature, and developers will need a really good reason to leave them behind. And frankly, I just don't see these reasons in Blazor. Really cool, though…

Next topic was gRPC. gRPC is a contract-based communication protocol that lets services communicate with each other easily and blazingly fast. gRPC uses Protobuf to define the schema of its messages, and, as its name implies, is an RPC-based protocol, as opposed to REST, which is basically an entity-based protocol.

ASP.NET Core 3.0 has a built-in template for gRPC services, and it's quite easy to implement one.
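As a rough illustration (not the exact demo code), here's what a service built from that template looks like. It assumes the template's greet.proto contract; Greeter.GreeterBase, HelloRequest, and HelloReply are all generated from the proto at build time:

```csharp
// A sketch of a gRPC service in ASP.NET Core 3.0, assuming the default
// template's contract:
//   service Greeter { rpc SayHello (HelloRequest) returns (HelloReply); }
using System.Threading.Tasks;
using Grpc.Core;

public class GreeterService : Greeter.GreeterBase
{
    // Override the generated base method to implement the RPC.
    public override Task<HelloReply> SayHello(HelloRequest request, ServerCallContext context)
    {
        return Task.FromResult(new HelloReply { Message = $"Hello {request.Name}" });
    }
}
```

The service is then exposed through endpoint routing with endpoints.MapGrpcService&lt;GreeterService&gt;().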

Now, I'm still on the fence regarding the use of gRPC. True, its performance is great, but the fact that supporting libraries are required on both the server and the client, as opposed to JSON, which requires nothing, is a letdown for me.

Lastly, SignalR.

The new version brings some major improvements, the most important being Auto Reconnect and streaming support.
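For example, here's a minimal sketch (placeholder hub URL and names, not from the session) of turning on Auto Reconnect in the .NET client:

```csharp
// Enabling the new Auto Reconnect behavior in the SignalR .NET client (3.0).
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR.Client;

class Program
{
    static async Task Main()
    {
        HubConnection connection = new HubConnectionBuilder()
            .WithUrl("https://example.com/chathub") // placeholder hub URL
            .WithAutomaticReconnect()               // default retry delays: 0, 2, 10, 30 seconds
            .Build();

        // Fires when the connection drops and reconnection attempts begin.
        connection.Reconnecting += error =>
        {
            Console.WriteLine("Connection lost, reconnecting...");
            return Task.CompletedTask;
        };

        await connection.StartAsync();
    }
}
```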

I, personally, love SignalR, and the addition of Auto Reconnect makes it even more attractive. I’ll definitely include it in my future architectures.

 

Connect Workforce and Apps to Azure AD (Agnieszka Girling & Joseph Martinez, Microsoft)

An overview of the features in Azure AD that help admins (and architects, to some extent) connect various apps to Azure AD, thus providing sign-on & SSO services, user and group provisioning, and authorization services.

The session was packed with demos, and it showed how easy it is to connect existing web apps to the Azure AD authentication services, without writing a single line of code.

Other demos showed granular authorization, and the new Workspaces feature, which groups apps by various parameters to make them more accessible.

 

Developer Guide to Cosmos DB (Deborah Chen, Microsoft)

Another session I'd been looking forward to. I do have experience with Cosmos, the globally distributed, NoSQL, multi-API database, but I know I have a lot to learn.

The session was built around a logistics scenario, with various use cases (tracking delivery vehicles, fleet management).

Here is a simplified architecture of the solution:

[Image: Cosmos architecture]

As you can see, Cosmos is part of a solution that also contains an IoT hub, a function, and a web app.

Now, one of the most important things to decide on is which API to use on Cosmos, and the recommendation is to always try to use the SQL (Core) API. The other APIs are mainly for migration scenarios.

Now, I won’t walk you through all the technical details of working with Cosmos, you can see the recorded session for that. I do want to emphasize some points about using Cosmos:

  • Cosmos is very price sensitive. Use the Cosmos emulator to measure the charge per read and write, and provision the correct number of RUs (Request Units), which are the pricing units for Cosmos. There is also an RU calculator in Azure you can use for that.
  • Decide between manual provisioning and AutoPilot (see the sketch after this list). AutoPilot is great for unpredictable loads, and will auto-scale the number of RUs. Manual is, well, manual, and is better for predictable, consistent loads.
  • Think long and hard about partitioning. It has a great effect on performance.
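To illustrate the difference, here's a sketch of both provisioning modes using the Cosmos .NET SDK v3 (account, database, and container names are placeholders, and the autoscale API surfaced in later SDK versions than the one demoed):

```csharp
// Manual vs. AutoPilot (autoscale) throughput provisioning in Cosmos DB.
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

class Provisioning
{
    static async Task Main()
    {
        var client = new CosmosClient("https://<account>.documents.azure.com:443/", "<key>");
        Database db = await client.CreateDatabaseIfNotExistsAsync("logistics");

        // Manual: a fixed 400 RU/s – better for predictable, consistent loads.
        await db.CreateContainerIfNotExistsAsync(
            new ContainerProperties(id: "vehicles", partitionKeyPath: "/fleetId"),
            throughput: 400);

        // AutoPilot: scales automatically up to the given max RU/s – better
        // for unpredictable loads.
        await db.CreateContainerIfNotExistsAsync(
            new ContainerProperties(id: "telemetry", partitionKeyPath: "/vehicleId"),
            ThroughputProperties.CreateAutoscaleThroughput(4000));
    }
}
```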

And some quick tips:

  • Make sure the application and the Cosmos DB account are in the same region.
  • Use point reads (getting a specific item by its key) instead of queries when possible.
  • Tune the MaxConcurrency and MaxItemCount options in the client library (both shown in the sketch after this list).
  • To get the highest SLA possible (99.999%), use at least 2 regions.
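Here's a short sketch (placeholder names, my own illustration) of the point-read and query-tuning tips, using the Cosmos .NET SDK v3:

```csharp
// Point reads vs. queries, and tuning MaxConcurrency / MaxItemCount.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

class CosmosTips
{
    static async Task Main()
    {
        var client = new CosmosClient("https://<account>.documents.azure.com:443/", "<key>");
        Container container = client.GetContainer("logistics", "vehicles");

        // Point read: fetch one item by id + partition key – cheaper in RUs
        // than an equivalent SQL query.
        ItemResponse<dynamic> item =
            await container.ReadItemAsync<dynamic>("vehicle-42", new PartitionKey("fleet-1"));
        Console.WriteLine($"Point read cost: {item.RequestCharge} RU");

        // Query: tune parallelism and page size via QueryRequestOptions.
        var options = new QueryRequestOptions { MaxConcurrency = 4, MaxItemCount = 100 };
        FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(
            "SELECT * FROM c WHERE c.status = 'active'", requestOptions: options);

        while (iterator.HasMoreResults)
        {
            FeedResponse<dynamic> page = await iterator.ReadNextAsync();
            Console.WriteLine($"{page.Count} items, {page.RequestCharge} RU");
        }
    }
}
```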

All in all, a great session that delivered exactly what I was looking for. It looks like Microsoft is investing heavily in Cosmos, and it is definitely an alternative every cloud architect should look into.

 

Envisioning Tomorrow Keynote (Mitra Azizirad, Microsoft)

Craziest session so far – by far. Demonstrations of things to come, and a lot of food for thought.

Storing data in glass, talking to plants, glowing clothes, programming genes – a lot of interesting stuff that happens in Microsoft Research.

Sessions like that make me think that while in other companies the research division works for the rest of the company, at Microsoft it's the other way around – the other divisions work hard to fund the research, allowing it to invent some crazy things that one day, maybe, will improve our lives.

 

So that was day #3 of Ignite. A lot of fun, a lot of learning, a lot of networking – a good day.

See you tomorrow!

 

– – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – – –

Don’t forget to check out my courses:

The Complete Guide to Becoming a Software Architect

REST API Design – The Complete Guide

 
