
Commit a187839

Authored by JennaPaikowsky, PipKat, and PProfizi
Documentation edits (#1996)
* Edits part 1
* Edits part 2
* Apply suggestions from code review (co-authored by Kathy Pippert)
* Apply suggestions from code review (co-authored by Kathy Pippert)
* Apply suggestions from code review (co-authored by Kathy Pippert)
* Apply suggestions from code review (co-authored by Kathy Pippert)
* Changes from code review
* Review changes
* Apply suggestions from code review (co-authored by Paul Profizi)

Co-authored-by: Kathy Pippert <[email protected]>
Co-authored-by: Paul Profizi <[email protected]>
1 parent 3883355 commit a187839

18 files changed (+137, -144 lines)

doc/source/_static/simple_example.rst

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-Here's how you would open a result file generated by Ansys MAPDL (or another Ansys solver) and
+The following example shows how to open a result file generated by Ansys MAPDL (or another Ansys solver) and
 extract results:

 .. code-block:: default

doc/source/getting_started/compatibility.rst

Lines changed: 5 additions & 5 deletions
@@ -8,7 +8,7 @@ Operating system
 ----------------

 DPF supports Windows 10 and Rocky Linux 8 and later.
-To run DPF on CentOS 7, use DPF for 2024R2 (8.2) or older.
+To run DPF on CentOS 7, use DPF for 2024 R2 (8.2) or later.
 For more information, see `Ansys Platform Support <https://www.ansys.com/solutions/solutions-by-role/it-professionals/platform-support>`_.

 Client-server
@@ -23,8 +23,8 @@ version.

 As new features are developed, every attempt is made to ensure backward
 compatibility from the client to the server. Backward compatibility is generally ensured for
-the 4 latest Ansys versions. For example, ``ansys-dpf-core`` module with 0.8.0 version has been
-developed for Ansys 2023 R2 pre1 release, for 2023 R2 Ansys version. It is compatible with
+the four latest Ansys versions. For example, the ``ansys-dpf-core`` module 0.8.0 has been
+developed for the Ansys 2023 R2 version. It is compatible with
 2023 R2, 2023 R1, 2022 R2 and 2022 R1 Ansys versions.

 Starting with version ``0.10`` of ``ansys-dpf-core``, the packages ``ansys-dpf-gate``,
@@ -34,8 +34,8 @@ and prevent synchronization issues between the PyDPF libraries, requiring to dro
 previous to 2022 R2.

 **Ansys strongly encourages you to use the latest packages available**, as far they are compatible
-with the Server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
-module with 0.10.0 version is the latest available compatible package, it should be used.
+with the server version you want to run. Considering Ansys 2023 R1 for example, if ``ansys-dpf-core``
+module 0.10.0 is the latest available compatible package, it should be used.

 For ``ansys-dpf-core<0.10``, the `ansys.grpc.dpf <https://pypi.org/project/ansys-grpc-dpf/>`_
 package should also be synchronized with the server version.
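The backward-compatibility rule stated in this hunk (a client release is generally compatible with the four latest Ansys versions it targets) can be sketched in plain Python. This is an illustration of the rule only, not an authoritative compatibility matrix or the PyDPF API; the target list comes from the example in the text:

```python
# Illustrative sketch of the documented rule: a PyDPF client release is
# generally backward compatible with the four latest Ansys versions.
SUPPORTED_SPAN = 4

def is_backward_compatible(client_targets, server_version):
    """Return True if the server release falls within the client's supported span."""
    return server_version in client_targets[:SUPPORTED_SPAN]

# Per the text, ansys-dpf-core 0.8.0 targets 2023 R2 down to 2022 R1:
targets = ["2023 R2", "2023 R1", "2022 R2", "2022 R1"]
print(is_backward_compatible(targets, "2022 R1"))  # True
print(is_backward_compatible(targets, "2021 R2"))  # False
```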

doc/source/getting_started/dpf_server.rst

Lines changed: 8 additions & 7 deletions
@@ -16,7 +16,7 @@ The first standalone version of DPF Server available is 6.0 (2023 R2).

 The sections on this page describe how to install and use a standalone DPF Server.

-* For a quick start on using PyDPF, see :ref:`ref_getting_started`.
+* For a brief overview on using PyDPF, see :ref:`ref_getting_started`.
 * For more information on DPF and its use, see :ref:`ref_user_guide`.


@@ -65,7 +65,7 @@ PyDPF-Core is a Python client API communicating with a **DPF Server**, either
 through the network using gRPC or directly in the same process. PyDPF-Post is a Python
 module for postprocessing based on PyDPF-Core.

-Both PyDPF-Core and PyDPF-Post can be used with DPF Server. Installation instructions
+Both PyDPF-Core and PyDPF-Post can be used with the DPF Server. Installation instructions
 for PyDPF-Core are available in the PyDPF-Core `Getting started <https://dpf.docs.pyansys.com/version/stable/getting_started/install.html>`_.
 Installation instructions for PyDPF-Post are available in the PyDPF-Post `Getting started <https://post.docs.pyansys.com/version/stable/getting_started/install.html>`_.

@@ -98,10 +98,10 @@ to use thanks to its ``ansys_path`` argument.
 PyDPF otherwise follows the logic below to automatically detect and choose which locally installed
 version of DPF Server to run:

-- it uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
-- it then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
-- it then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
-- if then raises an error if all of the steps above failed to return a valid path to a DPF Server installation.
+- It uses the ``ANSYS_DPF_PATH`` environment variable in priority if set and targeting a valid path to a DPF Server installation.
+- It then checks the currently active Python environment for any installed standalone DPF Server, and uses the latest version available.
+- It then checks for ``AWP_ROOTXXX`` environment variables, which are set by the **Ansys installer**, and uses the latest version available.
+- It then raises an error if all of the preceding steps failed to return a valid path to a DPF Server installation.

 Run DPF Server in a Docker container
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -111,7 +111,8 @@ DPF Server can be run in a Docker container.
 in :ref:`Install DPF Server <target_installing_server>`, download the ``Dockerfile`` file.
 #. Optional: download any other plugin ZIP file as appropriate. For example, to access the ``composites`` plugin for Linux,
 download ``ansys_dpf_composites_lin_v2025.1.pre0.zip``.
-#. Copy all the ZIP files and ``Dockerfile`` file in a folder and navigate into that folder.
+#. Copy all the ZIP files and the ``Dockerfile`` file into a folder.
+#. Navigate into the folder used in the previous step.
 #. To build the DPF Docker container, run the following command:

 .. code::
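The four-step detection order in the rewritten bullet list can be sketched as a stdlib-only function. This is an illustration of the documented fallback logic, not PyDPF's actual implementation; the second step (scanning the active Python environment) is deliberately omitted because it has no stdlib equivalent:

```python
import os
import tempfile

def detect_dpf_server(env):
    """Sketch of the documented detection order (not PyDPF's actual code)."""
    # 1. ANSYS_DPF_PATH takes priority when set and pointing at a valid path.
    path = env.get("ANSYS_DPF_PATH")
    if path and os.path.isdir(path):
        return path
    # 2. A standalone DPF Server installed in the active Python environment
    #    would be checked here; omitted from this stdlib-only sketch.
    # 3. AWP_ROOTXXX variables set by the Ansys installer, latest version first.
    for key in sorted((k for k in env if k.startswith("AWP_ROOT")), reverse=True):
        if os.path.isdir(env[key]):
            return env[key]
    # 4. Every step failed to return a valid path: raise an error.
    raise RuntimeError("No valid DPF Server installation found")

# Demonstration with a fake environment: the highest AWP_ROOT version wins
# when ANSYS_DPF_PATH is unset.
with tempfile.TemporaryDirectory() as older, tempfile.TemporaryDirectory() as newer:
    env = {"AWP_ROOT231": older, "AWP_ROOT241": newer}
    assert detect_dpf_server(env) == newer
```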

doc/source/getting_started/index.rst

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ Getting started

 The Data Processing Framework (DPF) provides numerical simulation users and engineers with a toolbox
 for accessing and transforming simulation data. DPF can access data from Ansys solver
-result files as well as from several neutral (see :ref:`ref_main_index`).
+result files as well as from several neutral file formats. For more information, see :ref:`ref_main_index`.

 This **workflow-based** framework allows you to perform complex preprocessing and
 postprocessing operations on large amounts of simulation data.

doc/source/getting_started/install.rst

Lines changed: 5 additions & 5 deletions
@@ -16,8 +16,8 @@ with this command:

 pip install ansys-dpf-core

-PyDPF-Core plotting capabilities require to have `PyVista <https://pyvista.org/>`_ installed.
-To install PyDPF-Core with its optional plotting functionalities, use:
+PyDPF-Core plotting capabilities require you to have `PyVista <https://pyvista.org/>`_ installed.
+To install PyDPF-Core with its optional plotting functionalities, run this command:

 .. code::

@@ -62,10 +62,10 @@ then use the following command from within this local directory:

 pip install --no-index --find-links=. ansys-dpf-core

-Beware that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
-To allow for plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
+Note that PyDPF-Core wheelhouses do not include the optional plotting dependencies.
+To use the plotting capabilities, also download the wheels corresponding to your platform and Python interpreter version
 for `PyVista <https://pypi.org/project/pyvista/#files>`_ and
-`matplotlib <https://pypi.org/project/matplotlib/#files>`_, then place them in the same previous local directory and run the command above.
+`matplotlib <https://pypi.org/project/matplotlib/#files>`_. Then, place them in the same local directory and run the preceding command.


 Install in development mode

doc/source/getting_started/licensing.rst

Lines changed: 14 additions & 15 deletions
@@ -4,11 +4,10 @@
 Licensing
 =========

-This section details how to properly set up licensing, as well as what the user should expect in
-terms of limitations or license usage when running PyDPF scripts.
+This section describes how to properly set up licensing, as well as limitations and license usage when running PyDPF scripts.

-DPF follows a client-server architecture, which means that the PyDPF client library must interact with a running DPF Server.
-It either starts a DPF Server via a local installation of DPF Server, or it connects to an already running local or remote DPF Server.
+DPF follows a client-server architecture, so the PyDPF client library must interact with a running DPF Server.
+It either starts a DPF Server via a local DPF Server installation, or it connects to an already running local or remote DPF Server.

 DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.
 It is also available as a standalone application.
@@ -20,12 +19,12 @@ For more information on installing DPF Server, see :ref:`ref_dpf_server`.
 License terms
 -------------

-When using the DPF Server from an Ansys installation, the user has already agreed to the licensing
+When using the DPF Server from an Ansys installation, you have already agreed to the licensing
 terms when installing Ansys.

-When using a standalone DPF Server, the user must accept the ``DPF Preview License Agreement``
+When using a standalone DPF Server, you must accept the ``DPF Preview License Agreement``
 by following the indications below.
-Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` throws an exception.
+Starting a DPF Server without agreeing to the ``DPF Preview License Agreement`` creates an exception.

 DPF Preview License Agreement
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -51,7 +50,7 @@ existing license for the edition and version of DPF Server that you intend to us
 Configure licensing
 -------------------

-If your machine does not have a local Ansys installation, you need to define where DPF should look for a valid license.
+If your machine does not have a local Ansys installation, you must define where DPF should look for a valid license.

 To use a local license file, set the ``ANSYSLMD_LICENSE_FILE`` environment
 variable to point to an Ansys license file ``<license_file_to_use>``:
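As an aside to the ``ANSYSLMD_LICENSE_FILE`` step above, the variable can also be set from Python before any DPF Server starts. Both values below are placeholders, not real license sources; substitute your own license file path or ``port@host`` server address:

```python
import os

# Placeholder values: substitute a real license file path or port@host server.
os.environ["ANSYSLMD_LICENSE_FILE"] = "/path/to/ansys.lic"  # a license file
# os.environ["ANSYSLMD_LICENSE_FILE"] = "1055@licserver.example.com"  # a license server

print(os.environ["ANSYSLMD_LICENSE_FILE"])  # /path/to/ansys.lic
```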
@@ -85,12 +84,12 @@ License checks and usage
 ------------------------

 Some DPF operators require DPF to check for an existing license
-and some require DPF to check-out a compatible license increment.
+and some require DPF to checkout a compatible license increment.

-DPF is by default allowed to check-out license increments as needed.
+DPF is by default allowed to checkout license increments as needed.
 To change this behavior, see :ref:`here <licensing_server_context>`.

-To know if operators require a license increment check-out to run, check their ``license``
+To know if operators require a license increment checkout to run, check their ``license``
 attribute in :ref:`ref_dpf_operators_reference` or directly in Python by checking the operator's
 properties for a ``license`` key:

@@ -109,13 +108,13 @@ properties for a ``license`` key:


 To check which Ansys licensing increments correspond to ``any_dpf_supported_increments``,
-see :ref:`here<target_to_ansys_license_increments_list>`.
+see :ref:`Compatible Ansys license increments<target_to_ansys_license_increments_list>`.

-Even if an operator does not require a license check-out to run, most DPF operators still require
+Even if an operator does not require a license checkout to run, most DPF operators still require
 DPF to check for a reachable license server or license file.

-Operators which do not perform any kind of license check are source operators (data extraction
-operators) which do not perform any data transformation.
+Operators that do not perform any kind of license check are source operators (data extraction
+operators). These operators do not perform any data transformation.

 For example, when considering result operators, they perform data transformation if the requested
 location is not the native result location. In that case, averaging occurs which is considered

doc/source/operator_reference.rst

Lines changed: 10 additions & 11 deletions
@@ -4,16 +4,7 @@
 Operators
 =========

-DPF operators provide for manipulating and transforming simulation data.
-
-From DPF Server for Ansys 2023 R2 and later, the licensing logic for operators in DPF depend on the active
-`ServerContext <https://dpf.docs.pyansys.com/api/ansys.dpf.core.server_context.html#servercontext>`_.
-
-The available contexts are **Premium** and **Entry**.
-Licensed operators are marked as in the documentation using the ``license`` property.
-Operators with the ``license`` property as **None** do not require a license check-out.
-For more information about using these two contexts, see :ref:`user_guide_server_context`.
-Click below to access the operators documentation.
+DPF operators allow you to manipulate and transform simulation data.

 .. grid:: 1

@@ -32,9 +23,17 @@ Click below to access the operators documentation.
 :click-parent:


+For Ansys 2023 R2 and later, the DPF Server licensing logic for operators in DPF depends on the active
+`server context<https://dpf.docs.pyansys.com/version/stable/api/ansys.dpf.core.server_context.html#ansys.dpf.core.server_context.ServerContext>`_.
+
+The available contexts are **Premium** and **Entry**.
+Licensed operators are marked as such in the documentation using the ``license`` property.
+Operators with the ``license`` property set to **None** do not require a license checkout.
+For more information on using these two contexts, see :ref:`user_guide_server_context`.
+
 .. note::

-For Ansys 2023 R1 and earlier, the context is equivalent to Premium, with all operators loaded.
+For Ansys 2023 R1 and earlier, the context is equivalent to **Premium**, with all operators loaded.
 For DPF Server 2023.2.pre0 specifically, the server context defines which operators are loaded and
 accessible. Use the `PyDPF-Core 0.7 operator documentation <https://dpf.docs.pyansys.com/version/0.7/operator_reference.html>`_ to learn more.
 Some operators in the documentation might not be available for a particular server version.
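The ``license`` property convention described in this section can be illustrated with plain data. The operator names and records below are hypothetical stand-ins, not entries from the real DPF operator registry; the only grounded detail is that a ``license`` value of ``None`` means no checkout is needed, while a value such as ``any_dpf_supported_increments`` means one is:

```python
# Hypothetical operator records (not the real registry): shows how the
# documented ``license`` property separates licensed operators from ones
# that need no license checkout.
operators = {
    "some_source_operator": {"license": None},
    "some_averaging_operator": {"license": "any_dpf_supported_increments"},
}

needs_checkout = sorted(
    name for name, props in operators.items() if props["license"] is not None
)
print(needs_checkout)  # ['some_averaging_operator']
```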

doc/source/user_guide/concepts/concepts.rst

Lines changed: 11 additions & 11 deletions
@@ -3,33 +3,33 @@
 ==================
 Terms and concepts
 ==================
-DPF sees **fields of data**, not physical results. This makes DPF a
+DPF uses **fields of data**, not physical results. This makes DPF a
 very versatile tool that can be used across teams, projects, and
 simulations.

 Key terms
 ---------
 Here are descriptions for key DPF terms:

-- **Data source:** One or more files containing analysis results.
-- **Field:** Main simulation data container.
-- **Field container:** For a transient, harmonic, modal, or multi-step
+- **Data source**: One or more files containing analysis results.
+- **Field**: Main simulation data container.
+- **Fields container**: For a transient, harmonic, modal, or multi-step
 static analysis, a set of fields, with one field for each time step
 or frequency.
-- **Location:** Type of topology associated with the data container. DPF
+- **Location**: Type of topology associated with the data container. DPF
 uses three different spatial locations for finite element data: ``Nodal``,
 ``Elemental``, and ``ElementalNodal``.
-- **Operators:** Objects that are used to create, transform, and stream the data.
+- **Operators**: Objects that are used to create, transform, and stream the data.
 An operator is composed of a **core** and **pins**. The core handles the
 calculation, and the pins provide input data to and output data from
 the operator.
-- **Scoping:** Spatial and/or temporal subset of a model's support.
-- **Support:** Physical entity that the field is associated with. For example,
+- **Scoping**: Spatial and/or temporal subset of a model's support.
+- **Support**: Physical entity that the field is associated with. For example,
 the support can be a mesh, geometrical entity, or time or frequency values.
-- **Workflow:** Global entity that is used to evaluate the data produced
+- **Workflow**: Global entity that is used to evaluate the data produced
 by chained operators.
-- **Meshed region:** Entity describing a mesh. Node and element scopings,
-element types, connectivity (list of node indices composing each element) and
+- **Meshed region**: Entity describing a mesh. Node and element scopings,
+element types, connectivity (list of node indices composing each element), and
 node coordinates are the fundamental entities composing the meshed region.

 Scoping

doc/source/user_guide/concepts/stepbystep.rst

Lines changed: 17 additions & 19 deletions
@@ -23,7 +23,7 @@ Data can come from two sources:
 - **Manual input in DPF:** You can create fields of data in DPF.

 Once you specify data sources or manually create fields in DPF,
-you can create field containers (if applicable) and define scopings to
+you can create fields containers (if applicable) and define scopings to
 identify the subset of data that you want to evaluate.

 Specify the data source
@@ -103,27 +103,27 @@ This code shows how to define a mesh scoping:
 my_scoping.location = "Nodal" #optional
 my_scoping.ids = list(range(1,11))

-Define field containers
-~~~~~~~~~~~~~~~~~~~~~~~
-A **field container** holds a set of fields. It is used mainly for
+Define fields containers
+~~~~~~~~~~~~~~~~~~~~~~~~
+A **fields container** holds a set of fields. It is used mainly for
 transient, harmonic, modal, or multi-step analyses. This image
 explains its structure:

 .. image:: ../../images/drawings/field-con-overview.png

-A field container is a vector of fields. Fields are ordered with labels
-and IDs. Most commonly, a field container is scoped on the time label,
+A fields container is a vector of fields. Fields are ordered with labels
+and IDs. Most commonly, a fields container is scoped on the time label,
 and the IDs are the time or frequency sets:

 .. image:: ../../images/drawings/field-con.png

-You can define a field container in multiple ways:
+You can define a fields container in multiple ways:

 - Extract labeled data from a result file.
-- Create a field container from a CSV file.
-- Convert existing fields to a field container.
+- Create a fields container from a CSV file.
+- Convert existing fields to a fields container.

-This code shows how to define a field container from scratch:
+This code shows how to define a fields container from scratch:

 .. code-block:: python

@@ -137,9 +137,9 @@ This code shows how to define a field container from scratch:
 mscop = {"time":i+1,"complex":1}
 fc.add_field(mscop,dpf.Field(nentities=i+10))

-Some operators can operate directly on field containers instead of fields.
-Field containers are identified by ``fc`` suffixes in their names.
-Operators and field containers are explained in more detail
+Some operators can operate directly on fields containers instead of fields.
+Fields containers are identified by ``fc`` suffixes in their names.
+Operators and fields containers are explained in more detail
 in :ref:`transform_the_data`.

 .. _transform_the_data:
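The fields-container structure this hunk documents (a vector of fields addressed by label spaces such as ``{"time": i+1, "complex": 1}``) can be mimicked with a stdlib-only stand-in. This is an analogy, not the DPF API; ``FieldsContainer`` below is a plain Python class and the fields are bare lists rather than ``dpf.Field`` objects:

```python
# Stdlib stand-in for the concept above: a fields container is a vector of
# fields, each addressed by a label space such as {"time": 1, "complex": 1}.
class FieldsContainer:
    def __init__(self, labels):
        self.labels = set(labels)
        self._fields = []  # list of (label_space, field) pairs

    def add_field(self, label_space, field):
        if set(label_space) != self.labels:
            raise ValueError("label space must match the container's labels")
        self._fields.append((dict(label_space), field))

    def get_field(self, label_space):
        for space, field in self._fields:
            if space == label_space:
                return field
        raise KeyError(label_space)

# Mirrors the documented example: one field per time set, each of size i + 10.
fc = FieldsContainer(labels=["time", "complex"])
for i in range(3):
    fc.add_field({"time": i + 1, "complex": 1}, [0.0] * (i + 10))

print(len(fc.get_field({"time": 2, "complex": 1})))  # 11
```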
@@ -155,18 +155,16 @@ Use operators
 You use operators to import, export, transform, and analyze data.

 An operator is analogous to an integrated circuit in electronics. It
-has a set of input and output pins. Pins provide for passing data to
-and from operators.
+has a set of input and output pins. Pins pass data to and from operators.

-An operator takes input from a field, field container, or scoping using
+An operator takes input from a field, fields container, or scoping using
 an input pin. Based on what it is designed to do, the operator computes
-an output that it passes to a field or field container using an output pin.
+an output that it passes to a field or fields container using an output pin.

 .. image:: ../../images/drawings/circuit.png

 Comprehensive information on operators is available in :ref:`ref_dpf_operators_reference`.
-In the **Available Operators** area for either the **Entry** or **Premium** operators,
-you can either type a keyword in the **Search** option
+In the **Available Operators** area, you can either type a keyword in the **Search** option
 or browse by operator categories:

 .. image:: ../../images/drawings/help-operators.png
