Azure Data Explorer gets new engine, numerous enhancements and Synapse integration

An Azure Data Explorer dashboard

Credit: Microsoft

Today, at a dedicated online event for Azure Data Explorer (ADX), Microsoft is announcing numerous enhancements to the service, including a next-generation release of the underlying engine and an array of integration points that should make it more accessible, more enticing and more useful. ADX, which is most often used for telemetry and data lake workloads and for analytics delivered as a service, will now run faster overall, gain numerous optimizations and connect with a variety of other data services, streaming data sources and data visualization tools. These changes should help a service that has been very successful, but not especially well-known even among Azure analytics experts, achieve more mainstream appeal.

Performance gains galore

What new features are coming to ADX? To start with, Microsoft is introducing a new version of the core engine (in preview, with general availability expected in February), which takes a wholly different approach to querying data. The Kusto v3 engine generates multiple versions of the desired query, selects the fastest and compiles it to native code before executing, so that it runs at maximum speed. The indexing layer in the v3 engine has also been rewritten. As a result of these changes, Microsoft says queries will run between 2x and 30x faster.
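To make that strategy concrete, here is a minimal Python sketch of the general pick-the-cheapest-plan idea; the plan functions and timing-based cost model are hypothetical stand-ins for the Kusto v3 engine's internal planner and native-code compiler:

```python
# Conceptual sketch of the plan-selection strategy described above.
# All names here are hypothetical; ADX's real planner and native-code
# compiler are internal to the Kusto v3 engine.
import timeit
from typing import Callable, List

Plan = Callable[[list], list]

def plan_filter_then_project(rows: list) -> list:
    # Candidate 1: filter first, then keep only the needed column.
    return [r["value"] for r in rows if r["value"] > 10]

def plan_project_then_filter(rows: list) -> list:
    # Candidate 2: project first, then filter the narrower data.
    return [v for v in (r["value"] for r in rows) if v > 10]

def estimate_cost(plan: Plan, sample: list) -> float:
    # Stand-in for a cost model: time the plan on a small sample.
    return timeit.timeit(lambda: plan(sample), number=100)

def choose_plan(candidates: List[Plan], sample: list) -> Plan:
    # Pick the cheapest candidate; the real engine would then
    # compile the winner to native code before executing it.
    return min(candidates, key=lambda p: estimate_cost(p, sample))

rows = [{"value": i} for i in range(1000)]
best = choose_plan([plan_filter_then_project, plan_project_then_filter], rows[:100])
print(len(best(rows)))
```

The sketch captures only the selection step; the production engine uses a cost model rather than trial execution, and gains its speed from the native-code compilation that follows.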

And beyond this raw performance gain, ADX will now offer self-refreshing materialized views, query result-set caching and configurable sharding/partitioning. Near-real-time scoring with machine learning models is being added as well, including models hosted on Azure Machine Learning and models from other platforms packaged in ONNX format. Fast Fourier transforms, geospatial joins and polynomial regression are on the way too. ADX is also getting row-level security capabilities that will make it more appealing to customers who want to support a wide
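For the ONNX-based scoring mentioned above, the sketch below shows what scoring an ONNX-packaged model looks like with ONNX Runtime in Python. The model file name and input shape are hypothetical, and ADX itself invokes such models server-side inside queries rather than through this client-side API:

```python
# A minimal sketch of scoring an ONNX-packaged model, assuming a
# hypothetical model file "model.onnx" that takes one float tensor.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# One batch of feature rows to score in near real time.
features = np.array([[0.2, 1.5, 3.1]], dtype=np.float32)
predictions = session.run(None, {input_name: features})[0]
print(predictions)
```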

WISeKey Secures Ensurity’s Passwordless Access to Microsoft Windows and Azure AD

While remote working is on the rise, organizations are struggling to maintain consistent digital security to protect their sensitive data in transit. Ensurity has selected WISeKey’s certified cybersecurity solutions to secure the digital credentials of geographically distributed workforces.

Geneva, Switzerland – Hyderabad, India – October 8, 2020: WISeKey International Holding Ltd. (“WISeKey”) (SIX: WIHN, NASDAQ: WKEY), a leading global cybersecurity and IoT company, announced today that Ensurity Technologies (“Ensurity”), a Hyderabad, India-based cybersecurity company and a member of the Microsoft Intelligent Security Association (MISA), has selected WISeKey’s Common Criteria-certified secure microprocessors to design its ThinC-AUTH FIDO2-certified biometric key for access to Microsoft® Windows® and Azure® AD.

For more than two decades, WISeKey has been one of the few recognized providers of first-in-class hardware and software digital security solutions for cybercrime protection and for the identification and authentication of people and objects. WISeKey designs secure chips that are widely used in highly sensitive applications such as banking, national ID and pay TV. Most of these chips are designed for, and certified to, Common Criteria EAL5+, one of the highest government-grade security certifications.

Ensurity has been partnering with Microsoft through the Microsoft Intelligent Security Association (MISA), an elite group of cybersecurity technology companies, to support the FIDO2 passwordless initiative and offer secure login to Microsoft Windows and Azure Active Directory (AD), a solution for remote access to corporate data and applications. ThinC-AUTH is a FIDO2-certified, Microsoft-compatible USB security key featuring a fingerprint touch sensor. The key is designed around WISeKey’s secure chip, which carries a government-grade digital security certification, to guarantee that credentials, cryptographic material and users’ fingerprints are stored in a safe place.
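As a rough illustration of the public-key challenge-response that underpins a FIDO2 key like ThinC-AUTH, here is a simplified Python sketch. Real FIDO2/WebAuthn adds attestation, origin binding and signature counters, and Ensurity's actual implementation lives inside the secure chip, not in software like this:

```python
# Simplified sketch of the FIDO2 idea: the credential private key
# never leaves the device; the server verifies a signed challenge
# with the previously registered public key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the authenticator generates a key pair and shares
# only the public key with the relying party (e.g., Azure AD).
device_private_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = device_private_key.public_key()

# Authentication: the server issues a random challenge...
challenge = os.urandom(32)

# ...the device signs it after the user touches the fingerprint sensor...
signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

# ...and the server verifies; this raises InvalidSignature on failure.
registered_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
print("challenge verified - user authenticated without a password")
```

Because only the public key ever leaves the device, a phished or breached server never holds anything that lets an attacker impersonate the user.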

“Living

UCSF, Fortanix, Intel, and Microsoft Azure Utilize Privacy-Preserving Analytics to Accelerate AI in Healthcare

UC San Francisco’s Center for Digital Health Innovation (CDHI), Fortanix, Intel, and Microsoft Azure today have formed a collaboration to establish a confidential computing platform with privacy-preserving analytics to accelerate the development and validation of clinical algorithms.

The platform will provide a “zero-trust” environment to protect both the intellectual property of an algorithm and the privacy of healthcare data, while CDHI’s proprietary BeeKeeperAI will provide the workflows to enable more efficient data access, transformation, and orchestration across multiple data providers.
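The zero-trust idea can be sketched roughly: data stays encrypted, and the decryption key is released only to code whose attested "measurement" matches what the data owner approved, so neither the algorithm owner nor the platform operator sees raw records. The mock attestation below is purely illustrative; real SGX/Fortanix attestation is cryptographic and hardware-backed:

```python
# Conceptual mock of attestation-gated data release. Hash-based
# "measurements" stand in for hardware attestation quotes.
import hashlib
from cryptography.fernet import Fernet

EXPECTED_MEASUREMENT = hashlib.sha256(b"approved-enclave-code-v1").hexdigest()

def attest(enclave_code: bytes) -> str:
    # Stand-in for a hardware attestation quote of the loaded code.
    return hashlib.sha256(enclave_code).hexdigest()

# Data owner encrypts the records before they leave the hospital.
key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"patient records")

# The key is released only if the enclave's measurement matches.
if attest(b"approved-enclave-code-v1") == EXPECTED_MEASUREMENT:
    records = Fernet(key).decrypt(ciphertext)
    print("attestation passed; records decrypted inside the enclave only")
```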

Gaining regulatory approval for clinical artificial intelligence (AI) algorithms requires highly diverse and detailed clinical data to develop, optimize, and validate unbiased algorithm models. Algorithms used in the context of delivering healthcare must perform consistently across diverse patient populations, socioeconomic groups, and geographic locations, and must be equipment-agnostic. Few research groups, or even large healthcare organizations, have access to enough high-quality data to accomplish these goals.

“While we have been very successful in creating clinical-grade AI algorithms that can safely operate at the point of care, such as immediately identifying life-threatening conditions on X-rays, the work was time consuming and expensive,” said Michael Blum, MD, associate vice chancellor for informatics, executive director of CDHI and professor of medicine at UCSF. “Much of the cost and expense was driven by the data acquisition, preparation, and annotation activities. With this new technology, we expect to markedly reduce the time and cost, while also addressing data security concerns.”

The organizations will leverage the confidential computing capabilities of Fortanix Confidential Computing Enclave Manager, Intel’s Software Guard Extensions (SGX) hardware-based security capabilities, Microsoft Azure’s confidential computing infrastructure, and UCSF’s BeeKeeperAI privacy-preserving analytics to calibrate a proven clinical algorithm against a simulated data set. A clinical-grade algorithm that rapidly identifies those needing blood transfusion in the emergency department following trauma will

Microsoft’s Azure AD authentication outage: What went wrong

Credit: Microsoft

On September 28 and 29 this week, a number of Microsoft customers worldwide were impacted by a cascading series of problems that left many unable to access their Microsoft apps and services. On October 1, Microsoft posted its post-mortem on the outages, outlining what happened and the next steps it plans to take to head off this kind of issue in the future.

Starting around 5:30 p.m. ET on Monday, September 28, customers began reporting that they couldn’t sign into Microsoft and third-party applications that use Azure Active Directory (Azure AD) for authentication. (Yes, this includes Office 365 and other Microsoft cloud services.) Those who were already signed in were less likely to have issues. According to Microsoft’s report, users in the Americas and Australia were more likely to be impacted than those in Europe and Asia.

Microsoft acknowledged that a service update intended for an internal validation test ring caused a crash in Azure AD backend services. “A latent code defect in the Azure AD backend service Safe Deployment Process (SDP) system caused this to deploy directly into our production environment, bypassing our normal validation process,” officials said.

Azure AD is designed to be geo-distributed and deployed across multiple partitions in multiple data centers around the world, with isolation boundaries built in. Microsoft normally applies changes first to a validation ring that doesn’t include customer data, followed by four additional rings over the course of several days, before they hit production. But this week, due to a defect, the SDP system didn’t correctly target the validation ring; all rings were targeted concurrently, causing service availability to degrade, Microsoft’s report says.
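As a hedged sketch of how such staged, health-gated ring deployment works in general (the ring names and health check here are illustrative, not Microsoft's internals), consider:

```python
import time

# Ring order reconstructed from the description above: a validation
# ring with no customer data, four staged rings, then production.
RINGS = ["validation", "ring-1", "ring-2", "ring-3", "ring-4", "production"]

def deploy_to(ring: str, change: str) -> None:
    print(f"deploying {change!r} to {ring}")

def ring_is_healthy(ring: str) -> bool:
    # Stand-in for real telemetry gates (sign-in success rates, errors).
    return True

def safe_deploy(change: str, soak_seconds: float = 0.0) -> None:
    # Changes advance one ring at a time, and only after the previous
    # ring soaks and stays healthy. The defect Microsoft describes
    # effectively skipped this gating, hitting all rings at once.
    for ring in RINGS:
        deploy_to(ring, change)
        time.sleep(soak_seconds)  # in practice: hours to days per ring
        if not ring_is_healthy(ring):
            print(f"halting rollout: {ring} unhealthy")
            return

safe_deploy("azure-ad-backend-update")
```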

Microsoft engineering knew within five minutes of the problem starting that something was wrong. During the next 30 minutes, Microsoft started taking steps to expedite