Copyedited the document on testing in our documentation (#1217)
* Copyediting testing.rst
* More copyediting
* Clarifications on how to generate control test data
* More copyediting
* removed one whitespace
* More fixes
* Minor fix
Co-authored-by: Benjamin Hackl <[email protected]>
Co-authored-by: Jason Villanueva <[email protected]>
docs/source/contributing/testing.rst (57 additions, 57 deletions)
@@ -1,23 +1,27 @@
 ============
 Adding Tests
 ============
-When adding a new feature, it should always be tested. Tests prevent
-manim from breaking at each new feature added by checking if any other
+If you are adding new features to manim, you should add appropriate tests for them. Tests prevent
+manim from breaking at each change by checking that no other
 feature has been broken and/or been unintentionally modified.
 
 How Manim Tests
 ---------------
 
-To conduct our tests, we use ``pytest``. Running ``pytest`` in the root of
-the project will start the testing process, and will show if there is
-something wrong.
+Manim uses pytest as its testing framework.
+To start the testing process, go to the root directory of the project and run pytest in your terminal.
+Any errors that occur during testing will be displayed in the terminal.
 
 Some useful pytest flags:
-- ``-x``, that will make pytest stop at the first fail,
-- ``-s``, that will make pytest display all the print messages (including those during scene generation, like DEBUG messages).
-- ``--skip_slow`` will skip the (arbitrarly) slow tests.
-- ``--show_diff`` will show a visual comparison in case an unit test is
-failing.
+
+- ``-x`` will make pytest stop at the first failure it encounters
+
+- ``-s`` will make pytest display all the print messages (including those during scene generation, like DEBUG messages)
+
+- ``--skip_slow`` will skip the (arbitrarily) slow tests
+
+- ``--show_diff`` will show a visual comparison in case a unit test is failing.
+
 
 How it Works
 ~~~~~~~~~~~~
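
The flags above can also be passed programmatically, which is occasionally convenient in editor or CI integrations. A minimal sketch, assuming ``--skip_slow`` and ``--show_diff`` are custom options registered in the project's ``conftest.py`` (plain pytest does not define them) and that the command is run from the project root:

.. code:: python

    # Minimal sketch: running pytest from Python with the flags described above.
    # Equivalent to running "pytest -x -s --show_diff tests/" from the project root.
    import pytest

    # -x: stop at the first failure; -s: show print/DEBUG output;
    # --show_diff: visualise frame differences for failing graphical tests
    # (assumed to be registered by the project's conftest.py).
    exit_code = pytest.main(["-x", "-s", "--show_diff", "tests/"])
    raise SystemExit(exit_code)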
@@ -26,30 +30,28 @@ At the moment there are three type of tests:
 
 #. Unit Tests:
 
-Basically test for pretty much everything. For example, there a test for
-``Mobject``, that checks if it can be added to a Scene, etc ..
+Tests for most of the basic functionalities of manim. For example, there is a test for
+``Mobject`` that checks if it can be added to a Scene, etc.
 
 #. Graphical unit tests:
 
-Because ``manim`` is a video library, we tests frames. To do so, we take a
-frame of control data for each feature and compare the last frame of the
-feature rendered (in the form of a numpy array). If it matches, the tests
-are successful. If one wants to visually see the what has changed, you can
-use ``--show_diff`` flag along with ``pytest`` to be able to visualize
-what is different.
+Because ``manim`` is a graphics library, we test frames. To do so, we create test scenes that render a specific feature.
+When pytest runs, it compares the last frame of every render to the control data; if it matches, the tests
+pass. If the test and control data differ, the tests fail. You can
+use the ``--show_diff`` flag with ``pytest`` to visually see the differences.
 
 #. Videos format tests:
 
-As Manim is a video library, we have to test videos as well. Unfortunalty,
-we can't test directly video content as manim outputs videos that can
-differ slightly from one system to another (for reasons related to
-ffmpeg). As such, we just compare videos configuration values, exported in
+As Manim is a video library, we have to test videos as well. Unfortunately,
+we cannot directly test video content as rendered videos can
+differ slightly depending on system (for reasons related to
+ffmpeg). Therefore, we only compare video configuration values, exported in
 .json.
 
 Architecture
 ------------
 
-``manim/tests`` directory looks like this:
+The ``manim/tests`` directory looks like this:
 
 ::
 
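
The frame comparison described for the graphical unit tests boils down to comparing numpy arrays. The sketch below illustrates that idea only; it is not the project's actual ``GraphicalUnitTester`` code, and the ``.npz`` key handling and function name are assumptions:

.. code:: python

    # Illustrative sketch of the frame-comparison idea described above; the real
    # logic lives in tests/utils/. Names and .npz layout here are assumptions.
    import numpy as np

    def frame_matches_control(rendered_frame: np.ndarray, control_path: str) -> bool:
        """Return True if the last rendered frame equals the stored control frame."""
        with np.load(control_path) as data:
            control_frame = data[data.files[0]]  # assume the frame is the first array
        return np.array_equal(rendered_frame, control_frame)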
@@ -123,29 +125,27 @@ The Main Directories
 
 - ``control_data/``:
 
-Here control data is saved. These are generally frames
-that we expect to see. In ``control_data/graphical_units_data/`` are all the
-.npz (represented the last frame) used in graphical unit tests videos, and in
-``control_data/videos_data/`` some .json used to check videos.
+The directory containing control data. ``control_data/graphical_units_data/`` contains the expected and correct frame data for graphical tests, and
+``control_data/videos_data/`` contains the .json files used to check videos.
 
 - ``test_graphical_units/``:
 
-For tests related to visual items that can appear in media
+Contains graphical tests.
 
 - ``test_scene_rendering/``:
 
-For tests that need to render a scene in a way or another. For example, CLI
+For tests that need to render a scene in some way, such as tests for CLI
 flags (end-to-end tests).
 
 - ``utils/``:
 
-Useful internal functions used by pytest to test.
+Useful internal functions used by pytest.
 
 .. Note:: fixtures are not contained here, they are in ``conftest.py``.
 
 - ``helpers/``:
 
-Helper function for developers to setup graphical/video tests.
+Helper functions for developers to setup graphical/video tests.
 
 Adding a New Test
 -----------------
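
The video format tests mentioned above compare configuration values rather than encoded frames, using the ``.json`` files in ``control_data/videos_data/``. A rough sketch of that idea, with hypothetical key names and function name (the actual ``.json`` layout may differ):

.. code:: python

    # Rough sketch of comparing a rendered video's configuration to control data.
    # Key names are hypothetical; only configuration values are compared, never
    # the encoded video stream itself (which varies across systems/ffmpeg builds).
    import json

    def video_config_matches(rendered_json: str, control_json: str) -> bool:
        with open(rendered_json) as f:
            rendered = json.load(f)
        with open(control_json) as f:
            expected = json.load(f)
        keys = ("width", "height", "avg_frame_rate", "codec_name")  # assumed keys
        return all(rendered.get(k) == expected.get(k) for k in keys)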
@@ -154,21 +154,21 @@ Unit Tests
 ~~~~~~~~~~
 
 Pytest determines which functions are tests by searching for files whose
-names begin with "test\_" and then within those files for functions
-beginning with "test" or classes beginning with "Test". These kind of
+names begin with "test\_", and then within those files for functions
+beginning with "test" and classes beginning with "Test". These kinds of
 tests must be in ``tests/`` (e.g. ``tests/test_container.py``).
 
 Graphical Unit Test
 ~~~~~~~~~~~~~~~~~~~
 
-The test must be written in the correct file and follow the structure
+The test must be written in the correct file (i.e. the file that corresponds to the appropriate category the feature belongs to) and follow the structure
 of unit tests.
 
 For example, to test the ``Circle`` VMobject which resides in
 ``manim/mobject/geometry.py``, add the CircleTest to
 ``test/test_geometry.py``.
 
-In ``test_geometry.py``:
+In ``test_geometry.py``, add:
 
 .. code:: python
 
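
The Python block that follows ``.. code:: python`` in the file falls outside this hunk and is not shown here. As a hedged illustration only, a graphical test scene of the kind the text describes could look roughly like the following; the repository's actual ``CircleTest`` may differ, and the use of the generic ``Animation`` is inferred from the context line of the next hunk:

.. code:: python

    # Hypothetical illustration of a graphical test scene; not the repository's
    # exact CircleTest. The Scene subclass is picked up by the graphical test
    # machinery and its last frame is compared against control data.
    from manim import Animation, Circle, Scene

    class CircleTest(Scene):
        def construct(self):
            circle = Circle()
            self.play(Animation(circle))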
@@ -183,7 +183,7 @@ we are testing whether Circle properly shows up with the generic
 
 .. Note::
 
-If the file already exists, just add to its content. The
+If the file already exists, edit it and add the test within the file. The
 ``Scene`` will be tested thanks to the ``GraphicalUnitTester`` that lives
 in ``tests/utils/GraphicalUnitTester.py``. Import it with ``from