Closed
Changes from all commits (46 commits)
020c12e
Fix fast-debug in dataservice image entrypoint.
robertbartel Oct 3, 2022
9901d4f
Add package client name vars to example.env.
robertbartel Sep 13, 2022
371cd56
Add new QueryType for dataset management messages.
robertbartel Sep 13, 2022
df3d259
Add GET_SERIALIZED_FORM query dataservice support.
robertbartel Oct 3, 2022
2777322
Refactor ExternalRequestClient async_make_request.
robertbartel Oct 3, 2022
13486ba
Adding client methods for data/metadata retrieval.
robertbartel Oct 12, 2022
50b0ba1
Refactor dataset request handler to use try block.
robertbartel Sep 13, 2022
853a8c8
Add get_serialized_datasets to external client.
robertbartel Sep 13, 2022
7f4abae
Improve DatasetExternalClient.
robertbartel Oct 3, 2022
8a05d71
Bump dataservice dependencies.
robertbartel Sep 13, 2022
274719b
Fix LIST_FILES query response bug in dataservice.
robertbartel Oct 3, 2022
4f6da15
Update dataset manager abstract interface.
robertbartel Oct 12, 2022
a65ddfe
More QueryType elements and update DatasetQuery.
robertbartel Oct 12, 2022
c403de7
Update dataset message to support start and size.
robertbartel Oct 12, 2022
ca81028
Bump requestservice dependencies.
robertbartel Sep 13, 2022
f823d81
Bump requestservice to 0.6.0.
robertbartel Sep 13, 2022
cb61094
Bump communication package version to 0.10.1.
robertbartel Oct 12, 2022
0199609
Optimize object store dataset reloading.
robertbartel Oct 12, 2022
98ccf92
Update object store manager for interface changes.
robertbartel Oct 12, 2022
6670dd1
Update modeldata dependency versions.
robertbartel Oct 12, 2022
0f0cd00
Update client unit tests for interface changes.
robertbartel Oct 12, 2022
fc13804
Bump client package version to 0.2.0.
robertbartel Oct 12, 2022
2afa9bd
Update dataservice response handling.
robertbartel Oct 12, 2022
edd2622
Update dataservice package dependency versions.
robertbartel Oct 12, 2022
0d0ad3a
Add dataset view class and static page for GUI.
robertbartel Sep 13, 2022
a6b0aa6
Refactor static dir usage in DMODProxy.py.
robertbartel Sep 13, 2022
354d3c5
Add Django GUI url def for dataset view.
robertbartel Sep 13, 2022
9916b4d
Add client package build arg for GUI Docker build.
robertbartel Sep 13, 2022
14f675f
Add second view and navigation.
robertbartel Sep 29, 2022
57632b1
Have DatasetManagementView.py send DS as list.
robertbartel Sep 30, 2022
cf2256a
Update dataset management GUI with details view.
robertbartel Sep 30, 2022
ced58ed
Cleanup after changing implementation approach.
robertbartel Sep 30, 2022
09afa58
Create AbstractDatasetView GUI Django view class.
robertbartel Oct 3, 2022
5f0384d
Add Django view for direct dataservice API calls.
robertbartel Oct 3, 2022
45303a9
Add GUI Django url for Dataset API AJAX calls.
robertbartel Oct 3, 2022
e98ef36
Add commented-out helper debug volume for GUI.
robertbartel Oct 3, 2022
3a7e39f
Refactor DatasetManagementView inheritance.
robertbartel Oct 3, 2022
2c44f7e
More progress on dataset management template.
robertbartel Oct 3, 2022
ea10c34
Add dataset delete support to Django API view.
robertbartel Oct 3, 2022
9717569
Implement dataset delete in management view.
robertbartel Oct 3, 2022
8ce767f
Adding Javascript component helper classes.
robertbartel Oct 12, 2022
e47e730
Update dataset API view.
robertbartel Oct 12, 2022
9831d00
Update dataset management view template.
robertbartel Oct 12, 2022
7d81da7
Add incomplete DatasetFileWebsocketFilelike.
robertbartel Oct 13, 2022
953cfdd
Fix GUI numpy dep issue from upstream changes.
robertbartel Nov 23, 2022
508a0ba
Fix GUI migrations issue from upstream changes.
robertbartel Nov 23, 2022
2 changes: 1 addition & 1 deletion docker/main/dataservice/entrypoint.sh
@@ -62,7 +62,7 @@ if [ -d ${UPDATED_PACKAGES_DIR:=/updated_packages} ]; then
for srv in $(pip -qq freeze | grep dmod | awk -F= '{print $1}' | awk -F- '{print $2}'); do
if [ $(ls ${UPDATED_PACKAGES_DIR} | grep dmod.${srv}- | wc -l) -eq 1 ]; then
pip uninstall -y --no-input $(pip -qq freeze | grep dmod.${srv} | awk -F= '{print $1}')
pip install $(ls ${UPDATED_PACKAGES_DIR}/*.whl | grep dmod.${srv}-)
pip install --no-deps $(ls ${UPDATED_PACKAGES_DIR}/*.whl | grep dmod.${srv}-)
fi
done
#pip install ${UPDATED_PACKAGES_DIR}/*.whl
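The name-extraction pipeline in the entrypoint diff can be tried in isolation. A minimal sketch, run on an assumed example line (the real script feeds it `pip -qq freeze` output):

```shell
# Sketch of the entrypoint's subpackage-name extraction on a sample line.
line="dmod-communication==0.10.1"
# Split on '=' to drop the version, then on '-' to keep the subpackage name.
srv=$(echo "$line" | awk -F= '{print $1}' | awk -F- '{print $2}')
echo "$srv"
# -> communication
```

Installing the matched wheel with `--no-deps` then keeps pip from also reinstalling that wheel's dependencies, which could otherwise silently replace package versions already baked into the image.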
3 changes: 3 additions & 0 deletions docker/nwm_gui/app_server/entrypoint.sh
@@ -34,6 +34,9 @@ echo "Starting dmod app"
#Extract the DB secrets into correct ENV variables
POSTGRES_SECRET_FILE="/run/secrets/${DOCKER_SECRET_POSTGRES_PASS:?}"
export SQL_PASSWORD="$(cat ${POSTGRES_SECRET_FILE})"
export DMOD_SU_PASSWORD="$(cat ${POSTGRES_SECRET_FILE})"

python manage.py migrate

# Handle for debugging when appropriate
if [ "$(echo "${PYCHARM_REMOTE_DEBUG_ACTIVE:-false}" | tr '[:upper:]' '[:lower:]' | tr -d '[:space:]')" == "true" ]; then
5 changes: 5 additions & 0 deletions docker/nwm_gui/docker-compose.yml
@@ -34,6 +34,7 @@ services:
args:
docker_internal_registry: ${DOCKER_INTERNAL_REGISTRY:?Missing DOCKER_INTERNAL_REGISTRY value (see 'Private Docker Registry ' section in example.env)}
comms_package_name: ${PYTHON_PACKAGE_DIST_NAME_COMMS:?}
client_package_name: ${PYTHON_PACKAGE_DIST_NAME_CLIENT:?}
networks:
- request-listener-net
# Call this when starting the container
@@ -57,11 +58,15 @@ services:
- SQL_USER=${DMOD_GUI_POSTGRES_USER:?}
- SQL_HOST=db
- SQL_PORT=5432
- DMOD_SU_NAME=dmod_super_user
- DMOD_SU_EMAIL=none@noaa.gov
- DATABASE=postgres
- DOCKER_SECRET_POSTGRES_PASS=postgres_password
volumes:
- ${DMOD_APP_STATIC:?}:/usr/maas_portal/static
- ${DMOD_SSL_DIR}/requestservice:/usr/maas_portal/ssl
# Needed only to speed up debugging
#- ${DOCKER_GUI_HOST_SRC:?GUI sources path not configured in environment}/MaaS:/usr/maas_portal/MaaS
#- ${DOCKER_GUI_HOST_VENV_DIR:-/tmp/blah}:${DOCKER_GUI_CONTAINER_VENV_DIR:-/tmp/blah}
# Expose Django's port to the internal network so that the web server may access it
expose:
5 changes: 5 additions & 0 deletions example.env
@@ -108,6 +108,11 @@ TROUTE_BRANCH=ngen
## Python Packages Settings ##
########################################################################

## The "name" of the built client Python distribution package, for purposes of installing (e.g., via pip)
PYTHON_PACKAGE_DIST_NAME_CLIENT=dmod-client
## The name of the actual Python communication package (i.e., for importing or specifying as a module on the command line)
PYTHON_PACKAGE_NAME_CLIENT=dmod.client

## The "name" of the built communication Python distribution package, for purposes of installing (e.g., via pip)
PYTHON_PACKAGE_DIST_NAME_COMMS=dmod-communication
## The name of the actual Python communication package (i.e., for importing or specifying as a module on the command line)
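The env file distinguishes the hyphenated distribution name pip installs from the dotted package name Python imports. A hypothetical one-line helper (not part of the codebase) capturing the convention used here:

```python
# Hypothetical helper illustrating the naming convention in this env file:
# the pip distribution name uses a hyphen ("dmod-client"), while the
# importable package name uses a dot ("dmod.client").
def dist_to_module(dist_name: str) -> str:
    """Map a distribution name to its import name under this convention."""
    return dist_name.replace('-', '.')
```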
29 changes: 29 additions & 0 deletions python/gui/MaaS/cbv/AbstractDatasetView.py
@@ -0,0 +1,29 @@
from abc import ABC
from django.views.generic.base import View
from dmod.client.request_clients import DatasetExternalClient
import logging
logger = logging.getLogger("gui_log")
from .DMODProxy import DMODMixin, GUI_STATIC_SSL_DIR
from typing import Dict


class AbstractDatasetView(View, DMODMixin, ABC):
Member:
I get the desire to add a blanket implementation here for downstream classes, but given that all methods and properties are implemented, is the use of abstract here to indicate to downstream users that they should inherit from this? Still coming up to speed on the codebase, so this might be an idiom I've not picked up on yet.

Contributor Author:
Essentially, yes. In particular, to reusably abstract a view that has a dataset client property. But it's not especially meaningful on its own.


def __init__(self, *args, **kwargs):
super(AbstractDatasetView, self).__init__(*args, **kwargs)
self._dataset_client = None

async def get_dataset(self, dataset_name: str) -> Dict[str, dict]:
serial_dataset = await self.dataset_client.get_serialized_datasets(dataset_name=dataset_name)
return serial_dataset

async def get_datasets(self) -> Dict[str, dict]:
serial_datasets = await self.dataset_client.get_serialized_datasets()
return serial_datasets

@property
def dataset_client(self) -> DatasetExternalClient:
if self._dataset_client is None:
self._dataset_client = DatasetExternalClient(endpoint_uri=self.maas_endpoint_uri,
ssl_directory=GUI_STATIC_SSL_DIR)
return self._dataset_client
5 changes: 4 additions & 1 deletion python/gui/MaaS/cbv/DMODProxy.py
@@ -16,6 +16,8 @@
from pathlib import Path
from typing import List, Optional, Tuple, Type

GUI_STATIC_SSL_DIR = Path('/usr/maas_portal/ssl')
Member:
We might want to mark this with a TODO so it can be made configurable in the future.

Contributor:
I think this should be handled by Django as well. It shouldn't really be necessary here; it should be in settings.py.

Member:
Nice! Thanks for catching this. A comment noting this implicit behavior would be appreciated!

Contributor Author:
For some reason I thought I had put that in settings.py.



class RequestFormProcessor(ABC):

@@ -209,7 +211,7 @@ class PostFormRequestClient(ModelExecRequestClient):
def _bootstrap_ssl_dir(cls, ssl_dir: Optional[Path] = None):
if ssl_dir is None:
ssl_dir = Path(__file__).resolve().parent.parent.parent.joinpath('ssl')
ssl_dir = Path('/usr/maas_portal/ssl') #Fixme
ssl_dir = GUI_STATIC_SSL_DIR #Fixme
return ssl_dir

def __init__(self, endpoint_uri: str, http_request: HttpRequest, ssl_dir: Optional[Path] = None):
@@ -315,6 +317,7 @@ def forward_request(self, request: HttpRequest, event_type: MessageEventType) ->
client = PostFormRequestClient(endpoint_uri=self.maas_endpoint_uri, http_request=request)
if event_type == MessageEventType.MODEL_EXEC_REQUEST:
form_processor_type = ModelExecRequestFormProcessor
# TODO: need a new type of form processor here (or 3 more, for management, uploading, and downloading)
else:
raise RuntimeError("{} got unsupported event type: {}".format(self.__class__.__name__, str(event_type)))

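As suggested in the GUI_STATIC_SSL_DIR discussion above, the path could be made configurable through Django settings. A minimal sketch, assuming a setting name DMOD_GUI_SSL_DIR that does not yet exist in the codebase (the settings object is passed in explicitly to keep the sketch framework-free):

```python
from pathlib import Path

# Default mirrors the hard-coded GUI_STATIC_SSL_DIR constant.
DEFAULT_SSL_DIR = Path('/usr/maas_portal/ssl')

def get_ssl_dir(settings_obj=None) -> Path:
    """Prefer an explicitly configured DMOD_GUI_SSL_DIR, else the default."""
    configured = getattr(settings_obj, 'DMOD_GUI_SSL_DIR', None)
    return Path(configured) if configured else DEFAULT_SSL_DIR
```

In the real GUI this lookup would read django.conf.settings, as the reviewers suggested for settings.py.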
62 changes: 62 additions & 0 deletions python/gui/MaaS/cbv/DatasetApiView.py
@@ -0,0 +1,62 @@
import asyncio
from django.http import JsonResponse
from wsgiref.util import FileWrapper
from django.http.response import StreamingHttpResponse
from .AbstractDatasetView import AbstractDatasetView
from .DatasetFileWebsocketFilelike import DatasetFileWebsocketFilelike
import logging
logger = logging.getLogger("gui_log")


class DatasetApiView(AbstractDatasetView):

def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)

def _get_dataset_content_details(self, dataset_name: str):
result = asyncio.get_event_loop().run_until_complete(self.dataset_client.get_dataset_content_details(name=dataset_name))
Contributor:
There are several calls to asyncio.get_event_loop().run_until_complete here. It might help to have synchronous versions of these functions on the dataset client, since there appears to be some intention to call them synchronously. Maybe.

Contributor Author:
The base functions for sending data through websockets are (and I am pretty sure must be) async, as are the functions making sure the client is authenticated (which rely on the former as well). It would take a decent bit of change to do this and have everything work, but we can look more at that in the future.

Member:
It seems we might want to wrap these methods (all methods with a leading _) in try/excepts and return something meaningful to the client if the request fails. I didn't go looking, but if the result has something like an ok property, we might want to return a non-200 here.

logger.info(result)
return JsonResponse({"contents": result}, status=200)

def _delete_dataset(self, dataset_name: str) -> JsonResponse:
result = asyncio.get_event_loop().run_until_complete(self.dataset_client.delete_dataset(name=dataset_name))
return JsonResponse({"successful": result}, status=200)

def _get_datasets_json(self) -> JsonResponse:
serial_dataset_map = asyncio.get_event_loop().run_until_complete(self.get_datasets())
return JsonResponse({"datasets": serial_dataset_map}, status=200)

def _get_dataset_json(self, dataset_name: str) -> JsonResponse:
serial_dataset = asyncio.get_event_loop().run_until_complete(self.get_dataset(dataset_name=dataset_name))
return JsonResponse({"dataset": serial_dataset[dataset_name]}, status=200)

def _get_download(self, request, *args, **kwargs):
dataset_name = request.GET.get("dataset_name", None)
item_name = request.GET.get("item_name", None)
chunk_size = 8192

custom_filelike = DatasetFileWebsocketFilelike(self.dataset_client, dataset_name, item_name)

response = StreamingHttpResponse(
FileWrapper(custom_filelike, chunk_size),
content_type="application/octet-stream"
)
response['Content-Length'] = asyncio.get_event_loop().run_until_complete(self.dataset_client.get_item_size(dataset_name, item_name))
Contributor:
StreamingHttpResponses are a little wonky. I don't think they are compatible with the Content-Length header, and they will clash with Gunicorn. It might be better to have a websocket that emits this data and let the client build the blob.

Contributor Author:
I don't have this part working anyway, so it can be implemented differently. I can pull this and other such things out entirely, but I wasn't sure if parts might be useful to show the direction I was trying to go.

Contributor:
That's precisely why it should be left in for now. I just wanted to drop some notes while I was looking at it. I don't think any of us has the bandwidth to make major changes at the moment.

response['Content-Disposition'] = "attachment; filename=%s" % item_name
return response

def get(self, request, *args, **kwargs):
request_type = request.GET.get("request_type", None)
if request_type == 'download_file':
Contributor:
This if chain leads me to believe that splitting this out across several smaller views might help.

Contributor Author:
I did originally start to do that, but it occurred to me that it somewhat violated the intent of having only a single websocket (at least per authenticated session) open for all client communication. In fairness, that isn't working properly at the moment, but I think we want to try to have it that way eventually, or else move to REST. And if we are eventually going for that, I don't think we want too many different views (i.e., because we'd eventually have to consolidate them), although I could be assessing that incorrectly.

Contributor:
Yeah, if there's a plan to consolidate, splitting them out doesn't make a whole lot of sense.

return self._get_download(request)
elif request_type == 'datasets':
return self._get_datasets_json()
elif request_type == 'dataset':
return self._get_dataset_json(dataset_name=request.GET.get("name", None))
elif request_type == 'contents':
return self._get_dataset_content_details(dataset_name=request.GET.get("name", None))
if request_type == 'delete':
Contributor:
This could probably be handled through a delete handler instead of a get handler. I don't know what all is going on in the workflow (like what the authentication looks like), but it looks like a user could possibly delete datasets through a browser address.

Contributor Author:
Indeed, but that should eventually be protected by our authentication and authorization mechanisms. And this relates to my other comment about how we (probably) want to consolidate views now, or will eventually, for how we want our websocket communication to work.

return self._delete_dataset(dataset_name=request.GET.get("name", None))

# TODO: finish
return JsonResponse({}, status=400)
Contributor:
Django has a specialized response that might help here: django.http.HttpResponseBadRequest. Using it can help when diagnosing issues in debug mode.

Member:
^^ Related, do we want to provide any context as to why a 400 was the response? i.e., expected one of request_type: .
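The two suggestions above (returning a non-200 with context when a client call fails, and naming the expected request_type values in the 400) can be sketched framework-free; the handler names and status codes here are illustrative, not the DMOD API:

```python
def dispatch(request_type, handlers):
    """Return an (http_status, body) pair for a GET's request_type value."""
    handler = handlers.get(request_type)
    if handler is None:
        # The 400 names the accepted values, as the reviewer suggested.
        expected = ", ".join(sorted(handlers))
        return 400, {"error": f"unknown request_type; expected one of: {expected}"}
    try:
        return 200, handler()
    except Exception as exc:
        # Surface the failure to the client instead of an unhandled 500.
        return 502, {"error": str(exc)}
```

In the Django view itself, the 400 branch would map naturally onto django.http.HttpResponseBadRequest and the others onto JsonResponse with an explicit status.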

20 changes: 20 additions & 0 deletions python/gui/MaaS/cbv/DatasetFileWebsocketFilelike.py
@@ -0,0 +1,20 @@
import asyncio
from typing import AnyStr
from dmod.client.request_clients import DatasetExternalClient


class DatasetFileWebsocketFilelike:

def __init__(self, client: DatasetExternalClient, dataset_name: str, file_name: str):
self._client = client
self._dataset_name = dataset_name
self._file_name = file_name
self._read_index: int = 0

def read(self, blksize: int) -> AnyStr:

result = asyncio.get_event_loop().run_until_complete(
Member (aaraney, Oct 20, 2022):
Do we want to try/catch here? If the download_item_block call raises an exception and a downstream user of DatasetFileWebsocketFilelike catches it, the state of their DatasetFileWebsocketFilelike could be corrupted, or at least it is unclear what the behavior should be. Meaning, here, _read_index, if caught during a failed read, should continue from the failure point. Is this the expected behavior, or should _read_index be reset to 0?

self._client.download_item_block(dataset_name=self._dataset_name, item_name=self._file_name,
blk_start=self._read_index, blk_size=blksize))
self._read_index += blksize
return result
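One answer to the question above is to advance the cursor only after a block has actually been retrieved, so a caught exception leaves the object resumable from the failure point. A sketch under that assumption, using a synchronous stand-in client rather than the real async DatasetExternalClient; it also advances by the bytes actually returned rather than blksize, which matters for the final short block:

```python
class ResumableFilelike:
    """File-like wrapper that stays resumable after a failed read."""

    def __init__(self, client, name: str):
        self._client = client
        self._name = name
        self._read_index = 0

    def read(self, blksize: int) -> bytes:
        # Fetch first: the cursor only moves once the block is in hand, so a
        # caller that catches an exception can simply call read() again.
        data = self._client.read_block(self._name, self._read_index, blksize)
        # Advance by the bytes actually returned, not blksize, so a short
        # final block doesn't skip past the end of the item.
        self._read_index += len(data)
        return data
```

A retried read after a failure then continues from the failed offset rather than restarting at 0.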
114 changes: 114 additions & 0 deletions python/gui/MaaS/cbv/DatasetManagementView.py
@@ -0,0 +1,114 @@
"""
Defines a view that may be used to configure a MaaS request
"""
import asyncio
from django.http import HttpRequest, HttpResponse
from django.shortcuts import render

import dmod.communication as communication
from dmod.core.meta_data import DataCategory, DataFormat

import logging
logger = logging.getLogger("gui_log")

from .utils import extract_log_data
from .AbstractDatasetView import AbstractDatasetView


class DatasetManagementView(AbstractDatasetView):
Member:
Thoughts on adding a delete method here to handle deleting a dataset? Or will that fall under the responsibility of post? Meaning, post would then likely handle creation, deletion, and modification.


"""
A view used to configure a dataset management request or requests for transmitting dataset data.
"""

def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)

def _process_event_type(self, http_request: HttpRequest) -> communication.MessageEventType:
"""
Determine and return whether this request is for a ``DATASET_MANAGEMENT`` or ``DATA_TRANSMISSION`` event.

Parameters
----------
http_request : HttpRequest
The raw HTTP request in question.

Returns
-------
communication.MessageEventType
Either ``communication.MessageEventType.DATASET_MANAGEMENT`` or
``communication.MessageEventType.DATA_TRANSMISSION``.
"""
# TODO:
Member:
Issue to track this? Or will this be completed in this PR?

raise NotImplementedError("{}._process_event_type not implemented".format(self.__class__.__name__))

def get(self, http_request: HttpRequest, *args, **kwargs) -> HttpResponse:
"""
The handler for 'get' requests.

This will render the 'maas/dataset_management.html' template after retrieving necessary information to initially
populate the forms it displays.

Parameters
----------
http_request : HttpRequest
The request asking to render this page.
args
kwargs

Returns
-------
A rendered page.
"""
errors, warnings, info = extract_log_data(kwargs)

# Gather map of serialized datasets, keyed by dataset name
serial_dataset_map = asyncio.get_event_loop().run_until_complete(self.get_datasets())
Contributor:
Another asyncio.get_event_loop().run_until_complete call. It might help to have two versions of get_datasets, get_datasets and get_datasets_async, with the former just calling the latter inside the event-loop wrapper.

Contributor Author:
I guess we can, but that seems cluttered. Not that I haven't already done that kind of cluttering in places...

Do you think that would just enhance developer usability of the client class, or were there other advantages you were anticipating?

Contributor:
I know that I'm the type of goober that would end up calling self.get_datasets() and forgetting to use the asyncio logic. That's just about the only thing the change would address.

serial_dataset_list = [serial_dataset_map[d] for d in serial_dataset_map]
Member:
This is nit-picky, but for readability, serial_dataset_list = list(serial_dataset_map.values()) might be preferred.


dataset_categories = [c.name.title() for c in DataCategory]
dataset_formats = [f.name for f in DataFormat]

payload = {
'datasets': serial_dataset_list,
'dataset_categories': dataset_categories,
'dataset_formats': dataset_formats,
'errors': errors,
'info': info,
'warnings': warnings
}

return render(http_request, 'maas/dataset_management.html', payload)

def post(self, http_request: HttpRequest, *args, **kwargs) -> HttpResponse:
"""
The handler for 'post' requests.

This will attempt to submit the request and rerender the page like a 'get' request.

Parameters
----------
http_request : HttpRequest
The request asking to render this page.
args
kwargs

Returns
-------
A rendered page.
"""
# TODO: implement this to figure out whether DATASET_MANAGEMENT or DATA_TRANSMISSION
event_type = self._process_event_type(http_request)
client, session_data, dmod_response = self.forward_request(http_request, event_type)

# TODO: this probably isn't exactly correct, so review once closer to completion
if dmod_response is not None and 'dataset_id' in dmod_response.data:
session_data['new_dataset_id'] = dmod_response.data['dataset_id']

http_response = self.get(http_request=http_request, errors=client.errors, warnings=client.warnings,
info=client.info, *args, **kwargs)

for k, v in session_data.items():
http_response.set_cookie(k, v)

return http_response
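The get_datasets/get_datasets_async pairing discussed in the review can be sketched as follows; the method body stands in for the real client call, and the names are illustrative:

```python
import asyncio

class DatasetViewMixin:
    async def get_datasets_async(self) -> dict:
        """Async source of truth (stands in for the real client call)."""
        return {"example-dataset": {"name": "example-dataset"}}

    def get_datasets(self) -> dict:
        """Synchronous convenience wrapper, so callers can't forget the
        event-loop boilerplate."""
        return asyncio.run(self.get_datasets_async())
```

Note that asyncio.run assumes no event loop is already running in the calling thread, which holds for WSGI-style Django views; under ASGI, asgiref's async_to_sync would be the safer wrapper.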