Commit 4cf6f57: comment fix (openvinotoolkit#22557)

msmykx-intel authored Jan 31, 2024 · 1 parent ffb5270
Showing 2 changed files with 58 additions and 69 deletions.
Installation & Deployment
#########################


.. meta::
   :description: OpenVINO™ API 2.0 focuses on the use of development tools and
                 deployment of applications, and also simplifies migration from
                 different frameworks to OpenVINO.


One of the main concepts for OpenVINO™ API 2.0 is being "easy to use", which includes:
* Development and deployment of OpenVINO-based applications.


To accomplish that, the 2022.1 release of OpenVINO introduced significant changes to the installation
and deployment processes. Further changes were implemented in 2023.1, aiming to make the installation
process even simpler.

These instructions are largely deprecated and should be used only for versions prior to 2023.1.

The OpenVINO Development Tools package is being deprecated and will be discontinued entirely in 2025.
With this change, the OpenVINO Runtime package has become the default choice for installing the
software. It now includes all components necessary to utilize OpenVINO's functionality.
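For example, a minimal setup now requires only the runtime package; a sketch of such an installation from PyPI (the package name is the one published there):

.. code-block:: sh

   # Install the OpenVINO Runtime package; since 2023.1 it also covers
   # functionality that previously required the separate openvino-dev package.
   $ python3 -m pip install openvino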



In OpenVINO 2022.1 and later, you can install the development tools only from a PyPI repository, using the following command:

.. code-block:: sh

   $ python3 -m pip install -r <INSTALL_DIR>/tools/requirements_tf.txt
This will install all the development tools and additional components necessary to work with TensorFlow via the ``openvino-dev`` package (see **Step 4. Install the Package** on the `PyPI page <https://pypi.org/project/openvino-dev/>`__ for parameters of other frameworks).
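Alternatively, the framework-specific dependencies can be requested as PyPI extras; the exact extra names are listed on the PyPI page linked above, with ``tensorflow2`` shown here as one example:

.. code-block:: sh

   # Install openvino-dev together with the TensorFlow 2 dependencies;
   # other frameworks use their own extras, as described on PyPI.
   $ python3 -m pip install "openvino-dev[tensorflow2]"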
Then, the tools can be used by commands like:

.. code-block:: sh

   $ pot -h
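For instance, a conversion run with the (now deprecated) Model Optimizer could look like this; ``model.pb`` is a hypothetical input file:

.. code-block:: sh

   # Convert a TensorFlow frozen graph to OpenVINO IR with Model Optimizer.
   $ mo --input_model model.pb --output_dir ir_output/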
Installation of any other dependencies is not required. For more details on the installation steps, see
`Install OpenVINO Development Tools <https://docs.openvino.ai/2023.3/openvino_docs_install_guides_install_dev_tools.html>`__ for versions prior to OpenVINO 2023.1.

Interface Changes for Building C/C++ Applications
#################################################
It is possible to build applications without the CMake interface by using MSVC IDE, UNIX makefiles, or any other build system. In OpenVINO 2021.4 and earlier, the required paths and libraries were the following:

.. tab-set::

   .. tab-item:: Include dirs
      :sync: include-dirs

      .. code-block:: sh

         <INSTALL_DIR>/deployment_tools/inference_engine/include
         <INSTALL_DIR>/deployment_tools/ngraph/include

   .. tab-item:: Path to libs
      :sync: path-libs

      .. code-block:: sh

         <INSTALL_DIR>/deployment_tools/inference_engine/lib/intel64/Release
         <INSTALL_DIR>/deployment_tools/ngraph/lib/

   .. tab-item:: Shared libs
      :sync: shared-libs

      .. code-block:: sh

         // UNIX systems
         inference_engine.so ngraph.so

         // Windows
         inference_engine.dll ngraph.dll

   .. tab-item:: (Windows) .lib files
      :sync: windows-lib-files

      .. code-block:: sh

         ngraph.lib
         inference_engine.lib
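For illustration, a manual link line for 2021.4 and earlier needed both core libraries; a minimal sketch on Linux, with a hypothetical source file ``app.cpp``:

.. code-block:: sh

   # Link against both inference_engine and ngraph, as required before 2022.1.
   $ g++ app.cpp -o app \
       -I<INSTALL_DIR>/deployment_tools/inference_engine/include \
       -I<INSTALL_DIR>/deployment_tools/ngraph/include \
       -L<INSTALL_DIR>/deployment_tools/inference_engine/lib/intel64 \
       -L<INSTALL_DIR>/deployment_tools/ngraph/lib \
       -linference_engine -lngraph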
In OpenVINO 2022.1 and later, the corresponding paths and libraries are:

.. tab-set::

   .. tab-item:: Include dirs
      :sync: include-dirs

      .. code-block:: sh

         <INSTALL_DIR>/runtime/include

   .. tab-item:: Path to libs
      :sync: path-libs

      .. code-block:: sh

         <INSTALL_DIR>/runtime/lib/intel64/Release

   .. tab-item:: Shared libs
      :sync: shared-libs

      .. code-block:: sh

         // UNIX systems
         openvino.so

         // Windows
         openvino.dll

   .. tab-item:: (Windows) .lib files
      :sync: windows-lib-files

      .. code-block:: sh

         openvino.lib
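The simplification is visible in a manual build: a single library now suffices. A minimal sketch on Linux, again with a hypothetical ``app.cpp`` (on Linux archives the libraries typically sit directly under ``runtime/lib/intel64``):

.. code-block:: sh

   # Compile and link directly against the single openvino library.
   $ g++ app.cpp -o app \
       -I<INSTALL_DIR>/runtime/include \
       -L<INSTALL_DIR>/runtime/lib/intel64 \
       -lopenvino

   # Make the shared library discoverable at run time.
   $ export LD_LIBRARY_PATH=<INSTALL_DIR>/runtime/lib/intel64:$LD_LIBRARY_PATH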
Clearer Library Structure for Deployment
########################################
Below are detailed comparisons of the library structure between OpenVINO 2022.1 and the previous releases:
* ``openvino_onnx_frontend`` is used to read ONNX models instead of ``inference_engine_onnx_reader`` (with its dependencies).
* ``openvino_paddle_frontend`` is added in 2022.1 to read PaddlePaddle models.
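To illustrate what this structure means for deployment, here is a sketch of collecting a minimal CPU-only inference bundle on Linux; the target directory and the exact file names (version suffixes vary between releases) are assumptions:

.. code-block:: sh

   # Hypothetical minimal bundle for CPU inference with IR models:
   # the core runtime, the CPU plugin, and the IR frontend.
   $ mkdir -p deploy
   $ cp <INSTALL_DIR>/runtime/lib/intel64/libopenvino.so* deploy/
   $ cp <INSTALL_DIR>/runtime/lib/intel64/libopenvino_intel_cpu_plugin.so deploy/
   $ cp <INSTALL_DIR>/runtime/lib/intel64/libopenvino_ir_frontend.so* deploy/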


.. Older versions of OpenVINO had several core libraries and plugin modules:
.. - Core: ``inference_engine``, ``ngraph``, ``inference_engine_transformations``, ``inference_engine_lp_transformations``
.. - Optional ``inference_engine_preproc`` preprocessing library (if ``InferenceEngine::PreProcessInfo::setColorFormat`` or ``InferenceEngine::PreProcessInfo::setResizeAlgorithm`` are used)
.. - Plugin libraries:
.. - ``MKLDNNPlugin`` for :doc:`CPU <openvino_docs_OV_UG_supported_plugins_CPU>` device
.. - ``clDNNPlugin`` for :doc:`GPU <openvino_docs_OV_UG_supported_plugins_GPU>` device
.. - ``MultiDevicePlugin`` for :doc:`Multi-device execution <openvino_docs_OV_UG_Running_on_multiple_devices>`
.. - others
.. - Plugins to read and convert a model:
.. - ``inference_engine_ir_reader`` to read OpenVINO IR
.. - ``inference_engine_onnx_reader`` (with its dependencies) to read ONNX models
.. Now, the modularity is more clear:
.. - A single core library with all the functionality ``openvino`` for C++ runtime
.. - ``openvino_c`` with Inference Engine API C interface
.. - **Deprecated** Optional ``openvino_gapi_preproc`` preprocessing library (if ``InferenceEngine::PreProcessInfo::setColorFormat`` or ``InferenceEngine::PreProcessInfo::setResizeAlgorithm`` are used)
.. - Use :doc:`preprocessing capabilities of OpenVINO API 2.0 <openvino_2_0_preprocessing>`
.. - Plugin libraries with clear names:
.. - ``openvino_intel_cpu_plugin``
.. - ``openvino_intel_gpu_plugin``
.. - ``openvino_auto_plugin``
.. - others
.. - Plugins to read and convert models:
.. - ``openvino_ir_frontend`` to read OpenVINO IR
.. - ``openvino_onnx_frontend`` to read ONNX models
.. - ``openvino_paddle_frontend`` to read Paddle models
Additional Resources
####################
* Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model zoo>`
* IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit <https://github.com/intel-iot-devkit>`__

