ESPC19 Day 2

The important thing first: this is a live blog post, so while I'm writing and publishing there may be some typos or grammatical issues. I will correct them in the days after the conference.

Okay, let's start with day two. Today I had a fully packed agenda. The keynote started a little bit differently 🙂 Here is a short video (nice fire)


Session Name: KeyNote (The Intelligent Workplace with Microsoft 365)

Speaker: Jeff Teper

Level:

Official Agenda:

Microsoft 365 empowers individuals, teams and organizations to be creative, collaborative, and effective with an integrated suite of experiences that are simple, superior, smart and secure. Explore the latest innovations and solutions for content collaboration, teamwork, process transformation, employee engagement & communications, and knowledge sharing & discovery. Learn how the experiences in Microsoft 365–including SharePoint, OneDrive, Yammer, Stream, PowerApps, Flow–integrate to deliver a secure, compliant, intelligent workplace across devices, on the web, in desktop and mobile apps, and in the hub for teamwork, Microsoft Teams.

Here are some pictures from the keynote:

A big section of the keynote was Project Cortex. The big thing here is organisational relationships. I'm not really a big Microsoft 365 consultant, but that project is a huge opportunity for every company. As a European, I'm not really sure whether that project and its possibilities are compatible with the GDPR. Project Cortex will be available in H1 2020.

The next part was the intelligent intranet and what the intranet of the future will look like. A big part of it is also the compatibility with mobile devices.

The intranet of the future will be able to show web parts that are relevant for you, depending on your role in the company. A new Yammer web part is also in place. Microsoft Search is also big news (announced at Ignite) and we saw many possibilities. There is also a complete rebuild of Yammer: Yammer Groups are renamed to Yammer Communities.

I'm not really a SharePoint consultant, but these new features really impressed me, and I'm looking forward to hopefully seeing some nice intranet configurations.

Security and compliance are also a big thing. There is a big project in combination with BP. In that session we saw the new Azure Information Protection capabilities and the integration with SharePoint. The automatic labeling functionality (a P2 feature) was also shown.

The new information barriers feature was also shown. With that functionality you can create policies that block collaboration between specific groups of users.

The key message about security: “Microsoft will be the best place to store data.” From my point of view, that's a big statement 🙂

The new Microsoft 365 Migration Manager was also shown.


Session Name: KeyNote (Winning with Azure)

Speaker: Tejas Dixit

Level:

Official Agenda:

Microsoft Azure is an ever-expanding set of cloud services to help your organization meet your business challenges. It’s the freedom to build, manage, and deploy applications on a massive, global network using your favorite tools and frameworks. Explore the latest innovations and solutions Azure is enabling for the customers and partners. Learn about how Azure makes it easy to migrate on premises apps or modernize existing applications or build net new cloud native apps.

Okay, the second keynote focused on Azure (my area) 🙂

Dixit talked a lot about digital transformation. References were, for example:

WalMart, Ebiya, Paccar

Paccar, for example, works with HoloLens in the automotive area. The capabilities are:

  • Be future ready
  • Build on your terms
  • Operate hybrid seamlessly
  • Trust your cloud

Be future ready: there have been 1000+ new capabilities in the last year. From an IaaS perspective, the new capabilities are:

From a PaaS perspective, the new capabilities are:

The primary focus is not a lift-and-shift migration; instead, the focus should be application modernization. The biggest growth in the PaaS section is AKS.

From a data perspective, the new capabilities are:

From an analytics perspective, the new capabilities are:

A big point here is definitely Azure Synapse. I blogged about this in one of my last posts (MICROSOFT IGNITE 2019 TAG 1).

From an AI and IoT perspective, the new capabilities are:

He also talked about Uber and how they use AI and IoT for their business. Uber built their solution in three months, so the time to market was awesome.

Build on your terms: the focus here is clearly the developer community. Microsoft commits more and more to the open-source community to give developers the opportunity to build on Microsoft or open-source capabilities. From my point of view, that's the right way and a really cool approach for every company and developer.

It's really impressive to see that 50.7% of developers use the open-source tool Visual Studio Code, which is absolutely free.

There was also a public preview announcement for "Visual Studio Online".

Operate hybrid seamlessly: hybrid is a fundamental thing for Microsoft and the investment is really huge. Every customer has to ask themselves "Why public cloud?" and maybe "Why not hybrid cloud?" That question is a key point, and Microsoft works really hard to develop a seamless hybrid environment:

Big points here are Azure Stack Hub, Azure Stack Edge and the newly announced Azure Arc (which I wrote about in one of my last posts).

Trust your cloud: it's a big point for every customer, because when a customer goes to the cloud, their data is located in one of Microsoft's datacenters. Microsoft has made a huge investment in security, has built best practices (the Cloud Adoption Framework) for getting to the cloud, and has also built an Azure Migration Program.


Session Name: Azure Governance Using Azure Blueprints

Speaker: Stephane Lapointe, MVP

Level: 300

Official Agenda:

There are great benefits for standards and processes for IT, whatever the organisation size. The cloud has great promises: agility, flexibility and elasticity just to name a few, but the democratisation of IT tasks in a cloud world can represent big challenges & risks. Take control of your Azure environments and resources while giving enough freedom to your developers to focus on their goals—not on compliance-related tasks. Learn how governance tools at your disposal will help you maintain this balance to suit your organisation needs.

In this session, discover how Azure Blueprints, Role Base Access Control (RBAC), Policies & Azure Resource Manager (ARM) allows you to obtain this balance between control and agility. Learn how to streamline environment creation, enable compliant development and protect key resources.

Benefits of Attending this Session: 

  1. Learn how to transpose your organisation’s governance requirements on Azure
  2. Learn about re-usability & best practices using ARM & Azure Blueprints
  3. Learn how to streamline environment creation, enable compliant development and protect key resources

The key points in that session were:

Why governance? The governance tools in Azure are:

In that session the topic “Blueprints” was the main focus. The traditional approach is changing; the modern approach looks like the following:

The first step for governance is always tagging. We have to define a tagging strategy and enforce it with policies!
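The session showed this with policies in the Azure portal. Just as an illustration (and because Terraform comes up again later today), here is a minimal sketch of how such a tag-enforcement policy could be expressed as code with the Terraform azurerm provider. The policy and the tag name are hypothetical, not from the session:

```hcl
# Hypothetical sketch: a custom Azure Policy that denies any resource
# created without a "costCenter" tag, defined via the azurerm provider.
resource "azurerm_policy_definition" "require_cost_center_tag" {
  name         = "require-costcenter-tag"
  display_name = "Require a costCenter tag on all resources"
  policy_type  = "Custom"
  mode         = "Indexed"

  policy_rule = <<RULE
{
  "if": {
    "field": "tags['costCenter']",
    "exists": "false"
  },
  "then": {
    "effect": "deny"
  }
}
RULE
}
```

Assigned to a management group or subscription, a policy like this stops untagged deployments before they happen.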

In that session I didn't hear about any new features, but it's great to see that my Azure Governance Workshop covers the same topics. So if you have any questions about Azure governance and want more details, please let me know. Write to me on Twitter or LinkedIn.

Here are some slides from the session:

A rich and powerful mechanism of Blueprints, which is only implemented in that service, is the lock/deny assignment. You can create a blueprint and define a read-only lockdown. If you want to remove a blueprint lock, the only way to achieve that is to unassign the blueprint from the subscription.

The only way to edit or modify Blueprints is to have the Owner or Blueprint Contributor permission.

Here is an overview of the idea behind Azure Blueprints.

Azure Blueprints has been in preview for a year now. The only reason for that is that the functionality to remove resources when you unassign a blueprint isn't GA yet.

Everything else is GA quality and you can use the Blueprints functionality for production environments; you have help desk support and so on.


Session Name: Code your Azure Deployments using Terraform

Speaker: Thorsten Hans, MVP

Level: 200

Official Agenda:

Using HashiCorp’s Terraform you can code almost every Azure Deployment. Having your entire infrastructure as source code makes it easy to spin up new environments in a couple of minutes. Terraform takes this approach a step further, having a single language and a single CLI allows you to describe deployments for almost every cloud vendor. With Terraform Execution Plans you can also inspect what will be deployed, changed or deleted in Azure without harming existing deployments. Join this talk by Thorsten Hans and learn how to write your deployments and quickly create different, independent environments such as Testing, Staging, and Production in Azure.

It's my first Terraform session and I'm really curious about the content. Terraform has one big benefit:

Preview changes before you deploy!

We start with Terraform – Write:

The next step is Terraform – Plan.

The last step is Terraform – Apply, which actually creates (or changes) the resources.

The Terraform lifecycle:


HashiCorp implemented a separate language, HCL (HashiCorp Configuration Language), but no worries: the language is based on JSON, so it's not a completely new language if you already write ARM templates. The speaker put the whole content on GitHub, and you can find it at https://github.com/ThorstenHans/espc2019-terraform
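To make the Write, Plan and Apply lifecycle concrete, here is a minimal sketch of what an HCL file could look like (the names are hypothetical, not taken from the session repo):

```hcl
# main.tf - minimal sketch: configure the Azure provider and
# describe a single resource group as code.
provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "demo" {
  name     = "rg-espc-demo"
  location = "West Europe"
}
```

After writing the file, `terraform init` downloads the azurerm provider, `terraform plan` shows exactly what would be created, changed or destroyed, and `terraform apply` performs the deployment.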

Hans splits the resources and code into separate files, for example:

  • main.tf
  • meta.tf
  • outputs.tf
  • variables.tf

This makes a lot of sense because, as your project grows, you can keep a clean structure. There are also really interesting ways to bring tags to each deployed resource.
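Here is a hedged sketch of what the variables.tf and outputs.tf parts could look like, with a shared tags map that every resource can reference (the values are made up for illustration):

```hcl
# --- variables.tf ---
# Input values that can differ per environment.
variable "environment" {
  type    = string
  default = "staging"
}

# A shared tag set; resources reference it via var.tags.
variable "tags" {
  type = map(string)
  default = {
    owner       = "espc-demo"
    environment = "staging"
  }
}

# --- outputs.tf ---
# Values exposed after the deployment, e.g. for other tooling.
output "resource_group_name" {
  value = azurerm_resource_group.demo.name
}
```

Every resource in main.tf can then simply set tags = var.tags, so all deployed resources carry the same tags.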

One important message: Terraform uses the same API as Azure itself (Azure Resource Manager), so the intelligence behind it was really awesome.

One important point: in Terraform you don't have to define dependencies yourself. Terraform tries to deploy the resources in parallel as much as possible, but it knows, for example, that an App Service needs an App Service Plan.

You can define dependencies explicitly if you want, but for that use case it isn't needed (see the sketch below)!
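A minimal sketch of that App Service example (hypothetical names, using the classic azurerm_app_service resources and the resource group from the sketch above): because the App Service references the plan's id, Terraform infers the dependency on its own.

```hcl
# The plan is deployed first, automatically, because the App Service
# below references its id - no explicit depends_on needed.
resource "azurerm_app_service_plan" "plan" {
  name                = "asp-espc-demo"
  location            = azurerm_resource_group.demo.location
  resource_group_name = azurerm_resource_group.demo.name

  sku {
    tier = "Standard"
    size = "S1"
  }
}

resource "azurerm_app_service" "app" {
  name                = "app-espc-demo"
  location            = azurerm_resource_group.demo.location
  resource_group_name = azurerm_resource_group.demo.name
  app_service_plan_id = azurerm_app_service_plan.plan.id

  tags = var.tags
}
```

Everything that has no such reference between resources is deployed in parallel.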

A cool Terraform reference can be found here.

Okay, my conclusion: Terraform is a really cool tool for automated deployments. I think I have to invest a bit of time to understand it and build a demo for a multi-cloud deployment 🙂


Session Name: How to Build Mission Critical, Globally Distributed Applications with Azure Cosmos DB

Speaker: Mark Brown

Level: 400

Official Agenda:

Developing highly-available, globally distributed applications in the cloud that respond with extreme low latency while maintaining consistent views of data worldwide is a challenging problem. In this session, we explore the PACELC theorem and demonstrate the relationship between availability, latency and consistency in a distributed environment. How to design for high-availability in Azure, and how to translate RPO and RTO to drive the design choices that must be made when building global applications in the cloud. If you build distributed applications for the cloud or advise customers how to do it, this session is for you. Packed with real-world scenarios and demos that you can take and use yourself.

Okay, the topic of that session was really new for me, because at the moment the only thing I've done with Cosmos DB is create a reserved instance 🙂 An important point about Azure Cosmos DB: the database is fully managed by Microsoft, so you don't have to think about memory, CPU, IO and so on.

The big benefit of Cosmos DB is that you can build distributed systems with consistency, low latency and high availability.

Consistency, in Microsoft's sense, looks like the following:

Latency:

We saw many demos in that session, for example the read latency for single-region and multi-region setups.

If you use multiple regions, the read latency is dramatically lower.

The next point was the write latency. With Cosmos DB you can achieve multi-region writes; the write latency looks like the following:

Consistency:

There are different consistency models (Strong, Bounded Staleness, Session, Consistent Prefix and Eventual):

Here is a sample application using Azure Cosmos DB, including Traffic Manager and web-tier applications.
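To connect this back to the Terraform session: here is a hedged sketch of how such a globally distributed Cosmos DB account could be described in HCL (this was not shown in the session; names and regions are made up, and the argument names are those of the azurerm provider at the time of the conference):

```hcl
# A Cosmos DB account replicated to two regions, with multi-region
# writes enabled and Session consistency (the default level).
resource "azurerm_cosmosdb_account" "demo" {
  name                = "cosmos-espc-demo"
  location            = azurerm_resource_group.demo.location
  resource_group_name = azurerm_resource_group.demo.name
  offer_type          = "Standard"
  kind                = "GlobalDocumentDB"

  enable_multiple_write_locations = true

  consistency_policy {
    consistency_level = "Session"
  }

  geo_location {
    location          = "West Europe"
    failover_priority = 0
  }

  geo_location {
    location          = "North Europe"
    failover_priority = 1
  }
}
```

Session consistency sits in the middle of the spectrum between Strong and Eventual, which is why it is the default for most applications.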

Okay, a great session, but for me it went a little bit too deep into data architecture. So at the moment it's not really an area for my future investment; other architects should have some work too 🙂


Finally, day 2 was finished. It was a really intense day with a lot of learning. Hopefully tomorrow will also bring new learnings for me.