@@ -19,7 +19,7 @@ In order to upload a single file, you need to:

     transport = AIOHTTPTransport(url='YOUR_URL')

-    client = Client(transport=sample_transport)
+    client = Client(transport=transport)

     query = gql('''
       mutation($file: Upload!) {
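For context, the hunk above patches the single-file upload example. A complete example consistent with it could look like the sketch below; the imports, the `with open(...)` block and the `YOUR_URL` / `YOUR_FILE_PATH` placeholders are assumptions filled in from the surrounding docs, not part of the diff itself.

.. code-block:: python

    from gql import Client, gql
    from gql.transport.aiohttp import AIOHTTPTransport

    # Transport pointing at the GraphQL endpoint (placeholder URL)
    transport = AIOHTTPTransport(url='YOUR_URL')

    client = Client(transport=transport)

    query = gql('''
      mutation($file: Upload!) {
        singleUpload(file: $file) {
          id
        }
      }
    ''')

    # Open the file in binary mode and pass it as a variable value;
    # upload_files=True makes gql send a multipart file upload request
    with open('YOUR_FILE_PATH', 'rb') as f:
        params = {"file": f}

        result = client.execute(
            query, variable_values=params, upload_files=True
        )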
@@ -46,7 +46,7 @@ It is also possible to upload multiple files using a list.

     transport = AIOHTTPTransport(url='YOUR_URL')

-    client = Client(transport=sample_transport)
+    client = Client(transport=transport)

     query = gql('''
      mutation($files: [Upload!]!) {
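Similarly, the second hunk touches the multiple-file upload example. A sketch of the full example is shown below; the `multipleUpload` mutation field and the file path placeholders are assumptions (the diff only shows the `$files: [Upload!]!` variable declaration and the closing `f1.close()` / `f2.close()` calls).

.. code-block:: python

    from gql import Client, gql
    from gql.transport.aiohttp import AIOHTTPTransport

    transport = AIOHTTPTransport(url='YOUR_URL')

    client = Client(transport=transport)

    # "multipleUpload" is an assumed server-side mutation field;
    # adapt it to the schema exposed by your GraphQL API
    query = gql('''
      mutation($files: [Upload!]!) {
        multipleUpload(files: $files) {
          id
        }
      }
    ''')

    f1 = open('YOUR_FILE_PATH_1', 'rb')
    f2 = open('YOUR_FILE_PATH_2', 'rb')

    params = {"files": [f1, f2]}

    result = client.execute(
        query, variable_values=params, upload_files=True
    )

    f1.close()
    f2.close()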
@@ -67,3 +67,111 @@ It is also possible to upload multiple files using a list.

     f1.close()
     f2.close()
+
+
+Streaming
+---------
+
+If you use the above methods to send files, then the entire contents of the files
+must be loaded in memory before the files are sent.
+If the files are not too big and you have enough RAM, this is not a problem.
+On the other hand, if you want to avoid using too much memory, it is better
+to read the files and send them in small chunks so that the entire file contents
+don't have to be in memory at once.
+
+We provide methods to do that for two different use cases:
+
+* Sending local files
+* Streaming downloaded files from an external URL to the GraphQL API
+
+Streaming local files
+^^^^^^^^^^^^^^^^^^^^^
+
+aiohttp allows uploading files using an asynchronous generator.
+See `Streaming uploads on aiohttp docs`_.
+
+In order to stream local files, instead of providing opened files to the
+`variable_values` argument of `execute`, you need to provide an async generator
+which will yield parts of the files.
+
+You can use `aiofiles`_
+to read the files in chunks and create this asynchronous generator.
+
+.. _Streaming uploads on aiohttp docs: https://docs.aiohttp.org/en/stable/client_quickstart.html#streaming-uploads
+.. _aiofiles: https://github.com/Tinche/aiofiles
+
+Example:
+
+.. code-block:: python
+
+    transport = AIOHTTPTransport(url='YOUR_URL')
+
+    client = Client(transport=transport)
+
+    query = gql('''
+      mutation($file: Upload!) {
+        singleUpload(file: $file) {
+          id
+        }
+      }
+    ''')
+
+    async def file_sender(file_name):
+        async with aiofiles.open(file_name, 'rb') as f:
+            chunk = await f.read(64 * 1024)
+            while chunk:
+                yield chunk
+                chunk = await f.read(64 * 1024)
+
+    params = {"file": file_sender(file_name='YOUR_FILE_PATH')}
+
+    result = client.execute(
+        query, variable_values=params, upload_files=True
+    )
+
+Streaming downloaded files
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+If the file you want to upload to the GraphQL API is not present locally
+and needs to be downloaded from elsewhere, then it is possible to chain the download
+and the upload in order to limit the amount of memory used.
+
+Because the `content` attribute of an aiohttp response is a `StreamReader`
+(it provides an async iterator protocol), you can chain the download and the upload
+together.
+
+In order to do that, you need to:
+
+* get the response from an aiohttp request and then get the StreamReader instance
+  from `resp.content`
+* provide the StreamReader instance to the `variable_values` argument of `execute`
+
+Example:
+
+.. code-block:: python
+
+    # First request to download your file with aiohttp
+    async with aiohttp.ClientSession() as http_client:
+        async with http_client.get('YOUR_DOWNLOAD_URL') as resp:
+
+            # We now have a StreamReader instance in resp.content
+            # and we provide it to the variable_values argument of execute
+
+            transport = AIOHTTPTransport(url='YOUR_GRAPHQL_URL')
+
+            client = Client(transport=transport)
+
+            query = gql('''
+              mutation($file: Upload!) {
+                singleUpload(file: $file) {
+                  id
+                }
+              }
+            ''')
+
+            params = {"file": resp.content}
+
+            result = client.execute(
+                query, variable_values=params, upload_files=True
+            )