On the second day of Intel Innovation 2023, Intel CTO Greg Lavender explained how the company’s open ecosystem philosophy is intended to make AI accessible to all developers.
Developers looking to adopt AI face several challenges that hinder the widespread rollout of solutions spanning edge and client through to the data centre and cloud. Intel says it is addressing these challenges with a software-defined, silicon-accelerated approach grounded in openness, choice, trust, and security. By offering tools that simplify the development of secure AI applications and reduce the cost of maintaining and scaling those solutions, the company aims to help developers bring AI into a wide range of environments.
“The developer community is the catalyst helping industries leverage AI to meet their diverse needs – both today and into the future,” Lavender said. “AI can and should be accessible to everyone to deploy responsibly. If developers are limited in their choice of hardware and software, the range of use cases for global-scale AI adoption will be constrained and likely limited in the societal value they are capable of delivering.”
Intel says it is committed to end-to-end security and uses Intel Transparent Supply Chain to verify hardware and firmware integrity. It has also introduced new tools and services to expand its platform security and data integrity protections. The new attestation service, part of Intel Trust Authority, provides an independent assessment of trusted execution environment integrity, policy enforcement, and audit records, and can be used anywhere Intel confidential computing is deployed. Intel Trust Authority underpins confidential AI by verifying the trustworthiness of confidential computing environments, particularly for inference on current and future generations of Intel Xeon processors.
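The announcement does not describe the Trust Authority client interface itself, but the general remote-attestation pattern such a service implements can be sketched: a workload running inside a trusted execution environment collects evidence about itself, an independent verifier appraises that evidence against policy and issues a signed token, and a relying party releases secrets or data only if the token checks out. The C++ sketch below is purely illustrative; every type and function name in it is hypothetical and is not part of the Intel Trust Authority API.

```cpp
#include <iostream>
#include <string>

// Hypothetical stand-ins for TEE evidence and a signed attestation token.
// These names are illustrative only, not taken from any Intel SDK.
struct Evidence { std::string measurement; };
struct AttestationToken { std::string tee_measurement; bool signature_valid; };

// Inside the enclave/confidential VM: gather evidence about the running code
// (stubbed here with a fixed measurement).
Evidence collect_evidence() { return {"sha384:abc123..."}; }

// Independent attestation service: appraises the evidence against its policies
// and returns a signed token the relying party can verify on its own.
AttestationToken appraise(const Evidence& e) { return {e.measurement, true}; }

// Relying party: only release keys, models, or data if the token verifies and
// the measurement matches the workload it expects.
bool relying_party_accepts(const AttestationToken& t, const std::string& expected) {
    return t.signature_valid && t.tee_measurement == expected;
}

int main() {
    Evidence e = collect_evidence();
    AttestationToken token = appraise(e);
    if (relying_party_accepts(token, "sha384:abc123...")) {
        std::cout << "TEE verified: release model weights/keys to the enclave\n";
    } else {
        std::cout << "Attestation failed: withhold sensitive data\n";
    }
    return 0;
}
```

The important property is that the appraisal is performed by a party independent of both the workload and the infrastructure operator, which is what allows a data owner to trust an enclave it does not operate itself.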
Lavender explained that Intel’s AI software strategy is based on open ecosystems and open accelerated computing to deliver AI everywhere. The aim is to create a level playing field for AI developers and scale innovation opportunities.
The oneAPI programming model enables code to be deployed across CPUs, GPUs, FPGAs, and other accelerators. Intel is working with Red Hat, Canonical, and SUSE to provide Intel-optimised enterprise software releases. Intel Granulate is adding Auto Pilot for Kubernetes pod resource rightsizing to help developers scale performance. Intel also plans to develop an ASIC accelerator for fully homomorphic encryption (FHE) and will launch a beta version of an encrypted computing software toolkit as part of its Developer Cloud.
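As a rough illustration of what “one code base across CPUs, GPUs, FPGAs, and accelerators” means in practice, here is a minimal, generic SYCL 2020 vector-add example (SYCL being the open standard underpinning oneAPI’s DPC++ compiler). It is not code from Intel’s announcement: the same kernel source is submitted to whichever device the runtime selects, and retargeting it means changing the device selection rather than rewriting the kernel.

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    // Pick whatever device the runtime considers best; this could be a CPU,
    // a GPU, or another accelerator, with no change to the kernel below.
    sycl::queue q{sycl::default_selector_v};
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024, 0.0f);
    {
        sycl::buffer<float> bufA(a.data(), sycl::range<1>(a.size()));
        sycl::buffer<float> bufB(b.data(), sycl::range<1>(b.size()));
        sycl::buffer<float> bufC(c.data(), sycl::range<1>(c.size()));

        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only, sycl::no_init);
            // The same kernel source runs on any supported backend.
            h.parallel_for(sycl::range<1>(a.size()), [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];
            });
        });
    } // Buffers go out of scope here, so results are copied back to the host.

    std::cout << "c[0] = " << c[0] << "\n"; // expected: 3
    return 0;
}
```

With Intel’s DPC++ compiler this would typically be built with a command along the lines of icpx -fsycl vector_add.cpp, after which the runtime chooses an available device at execution time.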