@@ -19,7 +19,7 @@ In order to upload a single file, you need to:

    transport = AIOHTTPTransport(url='YOUR_URL')

-    client = Client(transport=sample_transport)
+    client = Client(transport=transport)

    query = gql('''
      mutation($file: Upload!) {
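
(For reference beyond this hunk, a complete single-file upload would look roughly like the sketch below; the endpoint URL, the file path, and the `singleUpload` mutation are placeholders, not a real API.)

.. code-block:: python

    from gql import Client, gql
    from gql.transport.aiohttp import AIOHTTPTransport

    # Placeholder URL; replace with your GraphQL endpoint
    transport = AIOHTTPTransport(url='YOUR_URL')
    client = Client(transport=transport)

    query = gql('''
      mutation($file: Upload!) {
        singleUpload(file: $file) {
          id
        }
      }
    ''')

    # The opened file object is passed directly as a variable value
    with open("YOUR_FILE_PATH", "rb") as f:
        params = {"file": f}
        result = client.execute(
            query, variable_values=params, upload_files=True
        )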
@@ -46,7 +46,7 @@ It is also possible to upload multiple files using a list.

    transport = AIOHTTPTransport(url='YOUR_URL')

-    client = Client(transport=sample_transport)
+    client = Client(transport=transport)

    query = gql('''
      mutation($files: [Upload!]!) {
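
(Similarly, a full multiple-file upload sketch, assuming the server exposes a `multipleUpload` mutation; the mutation name and file paths are placeholders.)

.. code-block:: python

    from gql import Client, gql
    from gql.transport.aiohttp import AIOHTTPTransport

    transport = AIOHTTPTransport(url='YOUR_URL')
    client = Client(transport=transport)

    query = gql('''
      mutation($files: [Upload!]!) {
        multipleUpload(files: $files) {
          id
        }
      }
    ''')

    # Open the files and pass them together as a list variable value
    f1 = open("YOUR_FILE_PATH_1", "rb")
    f2 = open("YOUR_FILE_PATH_2", "rb")

    params = {"files": [f1, f2]}

    result = client.execute(
        query, variable_values=params, upload_files=True
    )

    f1.close()
    f2.close()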
@@ -69,51 +69,45 @@ It is also possible to upload multiple files using a list.
    f2.close()


-Aiohttp StreamReader
---------------------
+Streaming
+---------

-In order to upload a aiohttp StreamReader, you need to:
+If you use the above methods to send files, then the entire contents of the files
+must be loaded in memory before the files are sent.
+If the files are not too big and you have enough RAM, it is not a problem.
+On the other hand, if you want to avoid using too much memory, it is better
+to read the files and send them in small chunks so that the entire file contents
+don't have to be in memory at once.

-* get response from aiohttp request and then get StreamReader from `resp.content`
-* provide the StreamReader to the `variable_values` argument of `execute`
-* set the `upload_files` argument to True
+We provide methods to do that for two different use cases:

+* Sending local files
+* Streaming downloaded files from an external URL to the GraphQL API

-.. code-block:: python
+Streaming local files
+^^^^^^^^^^^^^^^^^^^^^

-    async with ClientSession() as client:
-        async with client.get('YOUR_URL') as resp:
-            transport = AIOHTTPTransport(url='YOUR_URL')
-            client = Client(transport=transport)
-            query = gql('''
-              mutation($file: Upload!) {
-                singleUpload(file: $file) {
-                  id
-                }
-              }
-            ''')
+aiohttp allows uploading files using an asynchronous generator.
+See `Streaming uploads on aiohttp docs`_.

-            params = {"file": resp.content}

-            result = client.execute(
-                query, variable_values=params, upload_files=True
-            )
+In order to stream local files, instead of providing opened files to the
+`variable_values` argument of `execute`, you need to provide an async generator
+which will yield parts of the files.

-Asynchronous Generator
-----------------------
+You can use `aiofiles`_
+to read the files in chunks and create this asynchronous generator.

-In order to upload a single file use asynchronous generator(https://docs.aiohttp.org/en/stable/client_quickstart.html#streaming-uploads), you need to:
+.. _Streaming uploads on aiohttp docs: https://docs.aiohttp.org/en/stable/client_quickstart.html#streaming-uploads
+.. _aiofiles: https://github.com/Tinche/aiofiles

-* сreate a asynchronous generator
-* set the generator as a variable value in the mutation
-* provide the opened file to the `variable_values` argument of `execute`
-* set the `upload_files` argument to True
+Example:

.. code-block:: python

    transport = AIOHTTPTransport(url='YOUR_URL')

-    client = Client(transport=sample_transport)
+    client = Client(transport=transport)

    query = gql('''
      mutation($file: Upload!) {
@@ -123,7 +117,7 @@ In order to upload a single file use asynchronous generator(https://docs.aiohttp
      }
    ''')

-    async def file_sender(file_name=None):
+    async def file_sender(file_name):
        async with aiofiles.open(file_name, 'rb') as f:
            chunk = await f.read(64 * 1024)
            while chunk:
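
(The local-file streaming example is split across several hunks, so here is a consolidated sketch of what the updated section describes; the 64 KiB chunk size, the file path, and the `singleUpload` mutation are placeholders, and `aiofiles` has to be installed separately.)

.. code-block:: python

    import aiofiles
    from gql import Client, gql
    from gql.transport.aiohttp import AIOHTTPTransport

    transport = AIOHTTPTransport(url='YOUR_URL')
    client = Client(transport=transport)

    query = gql('''
      mutation($file: Upload!) {
        singleUpload(file: $file) {
          id
        }
      }
    ''')

    # Async generator yielding the file in 64 KiB chunks,
    # so the whole file never has to be held in memory at once
    async def file_sender(file_name):
        async with aiofiles.open(file_name, 'rb') as f:
            chunk = await f.read(64 * 1024)
            while chunk:
                yield chunk
                chunk = await f.read(64 * 1024)

    params = {"file": file_sender('YOUR_FILE_PATH')}

    result = client.execute(
        query, variable_values=params, upload_files=True
    )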
@@ -134,4 +128,50 @@ In order to upload a single file use asynchronous generator(https://docs.aiohttp

    result = client.execute(
        query, variable_values=params, upload_files=True
-    )
+    )
+
+Streaming downloaded files
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+If the file you want to upload to the GraphQL API is not present locally
+and needs to be downloaded from elsewhere, then it is possible to chain the download
+and the upload in order to limit the amount of memory used.
+
+Because the `content` attribute of an aiohttp response is a `StreamReader`
+(it provides an async iterator protocol), you can chain the download and the upload
+together.
+
+In order to do that, you need to:
+
+* get the response from an aiohttp request and then get the StreamReader instance
+  from `resp.content`
+* provide the StreamReader instance to the `variable_values` argument of `execute`
+
+Example:
+
+.. code-block:: python
+
+    # First request to download your file with aiohttp
+    async with aiohttp.ClientSession() as http_client:
+        async with http_client.get('YOUR_DOWNLOAD_URL') as resp:
+
+            # We now have a StreamReader instance in resp.content
+            # and we provide it to the variable_values argument of execute
+
+            transport = AIOHTTPTransport(url='YOUR_GRAPHQL_URL')
+
+            client = Client(transport=transport)
+
+            query = gql('''
+              mutation($file: Upload!) {
+                singleUpload(file: $file) {
+                  id
+                }
+              }
+            ''')
+
+            params = {"file": resp.content}
+
+            result = client.execute(
+                query, variable_values=params, upload_files=True
+            )