# Working with NPU in OpenVINO™

To learn more about NPU in OpenVINO, refer to the [NPU Device](https://docs.openvino.ai/2024/openvino-workflow/running-inference/inference-devices-and-modes/npu-device.html) section in the docs.

## Notebook Contents

This tutorial provides a high-level overview of working with the NPU device **Intel® AI Boost** (introduced with the Intel® Core™ Ultra generation of CPUs) in OpenVINO. It explains some of the key properties of the NPU and shows how to compile a model on the NPU with performance hints; a minimal illustrative sketch of this step is included at the end of this README. The tutorial also shows example `benchmark_app` commands that can be run to compare NPU and CPU performance in different configurations.

Note that you need to [install a proper NPU driver](https://docs.openvino.ai/2024/get-started/configurations/configurations-intel-npu.html) to use the NPU successfully.

## Installation Instructions

If you have not installed all required dependencies, follow the [Installation Guide](../../README.md).
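
## Quick Example (sketch)

The snippet below is a minimal sketch of the workflow the notebook walks through: enumerating the available devices and compiling a model on the NPU with a performance hint via the OpenVINO Python API. The `model.xml` path is only a placeholder; substitute any OpenVINO IR model you have on disk.

```python
# Minimal sketch (not taken from the notebook itself): list available devices
# and compile a model on the NPU with a performance hint.
import openvino as ov

core = ov.Core()
print("Available devices:", core.available_devices)

if "NPU" in core.available_devices:
    # Query a basic NPU property to confirm the driver is set up correctly.
    print("NPU device name:", core.get_property("NPU", "FULL_DEVICE_NAME"))

    # Compile for NPU with a latency-oriented performance hint;
    # "THROUGHPUT" is the other common value.
    # "model.xml" is a placeholder path for an OpenVINO IR model.
    compiled_model = core.compile_model(
        "model.xml", "NPU", {"PERFORMANCE_HINT": "LATENCY"}
    )
else:
    print("NPU not found - check that the NPU driver is installed.")
```

For the benchmark comparisons, `benchmark_app` selects the device with `-d NPU` or `-d CPU` and sets the performance hint with `-hint latency` or `-hint throughput`; the notebook shows the exact commands it uses.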