diff --git a/IoT/README.md b/IoT/README.md
new file mode 100644
index 0000000..e01cfbf
--- /dev/null
+++ b/IoT/README.md
@@ -0,0 +1,10 @@
+# SAS REST API Examples
+
+## IoT
+
+This repository contains SAS-contributed examples that show the capabilities of the IoT APIs. You can use these examples for learning or for validating your environment.
+
+You are encouraged to contribute your own examples in the [User Contributions repository](../User_and_Aggregated_Samples).
+
+## Examples
+* [Event Stream Processing](eventStreamProcessing.md)
diff --git a/IoT/eventStreamProcessing.md b/IoT/eventStreamProcessing.md
new file mode 100644
index 0000000..87861af
--- /dev/null
+++ b/IoT/eventStreamProcessing.md
@@ -0,0 +1,12 @@
+# IoT Event Stream Processing APIs
+
+This repository contains SAS-contributed examples that show the capabilities of the IoT Event Stream Processing APIs.
+
+* SAS Event Stream Manager is a web-based client that enables you to deploy projects and monitor the state of your SAS Event Stream Processing environment.
+You can use SAS Event Stream Processing Studio to create the projects that you deploy to event stream processing (ESP) servers using SAS Event Stream Manager.
+
+* SAS Event Stream Processing Studio is a web-based client that enables you to create, test, and publish event stream processing models.
+Publishing a project version from SAS Event Stream Processing Studio makes the project version available to SAS Event Stream Manager.
+
+## Examples
+* [Import, publish, and run projects in a Kubernetes cluster](eventStreamProcessing/python/import-and-publish-projects/README.md)
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/README.md b/IoT/eventStreamProcessing/python/import-and-publish-projects/README.md
new file mode 100644
index 0000000..9320659
--- /dev/null
+++ b/IoT/eventStreamProcessing/python/import-and-publish-projects/README.md
@@ -0,0 +1,56 @@
+# Import, Publish, and Run Projects in a Kubernetes Cluster
+
+## Overview
+
+This Jupyter notebook uses the SAS Event Stream Processing Studio REST API and the SAS Event Stream Manager REST API to
+import, publish, and run projects in a Kubernetes cluster.
+
+Use this example when SAS Event Stream Processing is deployed with other SAS Viya platform products,
+and you are using SAS Event Stream Processing 2023.08 or later.
+
+This Jupyter notebook uses the project XML files in the `xml_projects` folder and performs the following tasks:
+1. Check whether the projects have already been imported to SAS Event Stream Processing Studio.
+2. If a project has been previously imported, then import it using the next version number. Otherwise, import the project as version 1.
+3. Make the projects public so that they are visible to all users.
+4. Publish the projects from SAS Event Stream Processing Studio to SAS Event Stream Manager.
+5. Synchronize the projects.
+6. Create a SAS Event Stream Manager deployment whose type is "Cluster".
+7. Run the projects in the Kubernetes cluster. This action creates and starts an ESP server for each project.
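
The version-selection logic in steps 1 and 2 can be sketched as follows. This is a minimal sketch: `decide_import_action` and its `next_version` parameter are hypothetical helpers, and in the notebook the real version number comes from the `/project-versions/projects/{projectId}/nextVersion` endpoint.

```python
def decide_import_action(existing_projects, project_name, next_version=2):
    """Return ('import', 1) for a first-time import, or ('version', n)
    when the project already exists in SAS Event Stream Processing Studio."""
    for project in existing_projects:
        # 'friendlyName' is the field the notebook compares against the XML project name
        if project.get("friendlyName") == project_name:
            # The notebook fetches next_version from the nextVersion endpoint
            return ("version", next_version)
    return ("import", 1)

print(decide_import_action([], "lua_connector"))  # → ('import', 1)
print(decide_import_action([{"friendlyName": "lua_connector"}], "lua_connector", 3))  # → ('version', 3)
```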
+
+## Check Prerequisites
+
+Ensure that Python 3 is available and that the following Python packages are installed:
+- requests
+- urllib3 (installed as a dependency of requests)
+
+The example also uses the standard library modules json, os, sys, time, and xml.etree.ElementTree, which require no installation.
+
+## Using the Example
+1. Download the Python program or the Jupyter notebook file.
+2. Ensure that the `xml_projects` folder (which contains the project XML files used by this example) is located at the same level as this Jupyter notebook.
+3. Edit the following variables to match your environment:
+ - `server`: the URL of the SAS Viya platform server
+ - `username`
+ - `password`
+ - `chosen_deployment_name`: specify a name for the SAS Event Stream Manager deployment that this example creates
+4. Run the program or Jupyter notebook commands.
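
Before calling any endpoint, the program authenticates with a SASLogon password grant. The request it builds can be sketched as follows (`https://your_server` is a placeholder; `sas.ec` is the client ID the notebook uses, with an empty secret):

```python
import base64

def build_token_request(server, username, password):
    """Assemble the SASLogon password-grant request used by this example."""
    return {
        "url": server + "/SASLogon/oauth/token",
        "data": {"grant_type": "password", "username": username, "password": password},
        "headers": {
            "Content-Type": "application/x-www-form-urlencoded",
            # base64 of 'sas.ec:' -- the sas.ec client ID with an empty secret
            "Authorization": "Basic " + base64.b64encode(b"sas.ec:").decode(),
        },
    }

req = build_token_request("https://your_server", "your_username", "your_password")
print(req["headers"]["Authorization"])  # → Basic c2FzLmVjOg==
# The notebook then posts this request and reads "access_token" from the JSON response:
# requests.post(req["url"], data=req["data"], headers=req["headers"], verify=False)
```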
+
+## Endpoints Used by This Example
+
+_**Note:** The SAS Event Stream Processing Studio REST API and SAS Event Stream Manager REST API documentation
+is currently for internal use only and is not available to external customers on developers.sas.com.
+You must be signed in with your SAS email address to access the linked content._
+
+### SAS Event Stream Processing Studio
+- [/esp-project](https://developers.sas.com/rest-apis/SASEventStreamProcessingStudio-v3?operation=getListModels) - List all projects
+- [/esp-project](https://developers.sas.com/rest-apis/SASEventStreamProcessingStudio-v3?operation=createProject) - Create a project
+- [/project-versions/projects/{projectId}/nextVersion](https://developers.sas.com/rest-apis/SASEventStreamProcessingStudio-v3?operation=getNextVersion) - Get the next version of a project
+- [/project-versions/projects/{projectId}](https://developers.sas.com/rest-apis/SASEventStreamProcessingStudio-v3?operation=createVersion) - Publish a project version
+- [/project-versions/projects/synchronize/{folderId}](https://developers.sas.com/rest-apis/SASEventStreamProcessingStudio-v3?operation=createSynchronizeProject) - Synchronize a published project with SAS Event Stream Manager
+
+### SAS Event Stream Manager
+- [/deployment](https://developers.sas.com/rest-apis/SASEventStreamManager-v3?operation=getListDeployments) - List all deployments
+- [/deployment](https://developers.sas.com/rest-apis/SASEventStreamManager-v3?operation=createDeployment) - Create a deployment
+- [/server/cluster](https://developers.sas.com/rest-apis/SASEventStreamManager-v3?operation=createStartProjectOnCluster) - Run a project in a cluster
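
To illustrate the `/deployment` endpoint, the selection logic this example applies to the response's `items` array can be sketched as follows (the sample payload below is invented for illustration):

```python
def find_cluster_deployment(items, name):
    """Return (uuid, label) of the cluster deployment whose label matches
    `name`, falling back to the first deployment in the list."""
    if not items:
        return None
    chosen = items[0]
    for deployment in items:
        if deployment.get("type") == "cluster" and deployment.get("label") == name:
            chosen = deployment
    return chosen["uuid"], chosen["label"]

# Invented sample "items" from a GET /deployment response:
items = [
    {"uuid": "a1", "label": "other", "type": "cluster"},
    {"uuid": "b2", "label": "test", "type": "cluster"},
]
print(find_cluster_deployment(items, "test"))  # → ('b2', 'test')
```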
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_in_ESM.png b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_in_ESM.png
new file mode 100644
index 0000000..81b8f3f
Binary files /dev/null and b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_in_ESM.png differ
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_with_projects_running_in_ESM.png b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_with_projects_running_in_ESM.png
new file mode 100644
index 0000000..f80947e
Binary files /dev/null and b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/deployment_with_projects_running_in_ESM.png differ
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/images/imported_projects_in_Studio.png b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/imported_projects_in_Studio.png
new file mode 100644
index 0000000..5d14dc2
Binary files /dev/null and b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/imported_projects_in_Studio.png differ
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/images/testing_imported_project_in_Studio.png b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/testing_imported_project_in_Studio.png
new file mode 100644
index 0000000..62283f9
Binary files /dev/null and b/IoT/eventStreamProcessing/python/import-and-publish-projects/images/testing_imported_project_in_Studio.png differ
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/import-and-publish-projects.ipynb b/IoT/eventStreamProcessing/python/import-and-publish-projects/import-and-publish-projects.ipynb
new file mode 100644
index 0000000..334b34a
--- /dev/null
+++ b/IoT/eventStreamProcessing/python/import-and-publish-projects/import-and-publish-projects.ipynb
@@ -0,0 +1,505 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Import and Publish Projects\n",
+    "This Jupyter notebook uses the SAS Event Stream Processing Studio REST API and the SAS Event Stream Manager REST API to import, publish, and run projects in a Kubernetes cluster.\n",
+ "\n",
+ "\n",
+    "Prerequisite: This example assumes that there is an *xml_projects* folder at the same level as this Jupyter notebook, which contains XML files that you want to import into SAS Event Stream Processing Studio and publish to SAS Event Stream Manager.\n",
+ "\n",
+ "This Jupyter notebook uses the project XML files in the `xml_projects` folder and performs the following tasks:\n",
+ "1. Check whether the projects have already been imported to SAS Event Stream Processing Studio.\n",
+ "2. If a project has been previously imported, then import it using the next version number. Otherwise, import the project as version 1.\n",
+ "3. Make the projects public so that they are visible to all users.\n",
+ "4. Publish the projects from SAS Event Stream Processing Studio to SAS Event Stream Manager.\n",
+ "5. Synchronize the projects.\n",
+ "6. Create a SAS Event Stream Manager deployment whose type is \"Cluster\".\n",
+ "7. Run the projects in the Kubernetes cluster. This action creates and starts an ESP server for each project.\n",
+ "\n",
+ " \n",
+ "**Important**: You must run the [Imports and Global Variables](#imports) cell before executing anything else in this notebook. Also ensure that you are authenticated by running the [Get Access Token](#authentication) cell.\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Imports and Global Variables \n",
+ "Run this cell before any of the others as it imports packages and sets variables that are used throughout this notebook."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import requests\n",
+ "import xml.etree.ElementTree as ET\n",
+ "import os\n",
+ "import json\n",
+ "import time\n",
+ "import sys\n",
+ "from urllib3.exceptions import InsecureRequestWarning\n",
+ "\n",
+ "\n",
+ "def bootstrap_server_and_credentials():\n",
+ " global server, username, password, chosen_deployment_name\n",
+ " server = \"https://your_server\"\n",
+ " username = \"your_username\"\n",
+ " password = \"your_password\"\n",
+ " chosen_deployment_name = \"test\"\n",
+ " if len(sys.argv) > 3:\n",
+ " server = sys.argv[1]\n",
+ " username = sys.argv[2]\n",
+ " password = sys.argv[3]\n",
+ "\n",
+ " print('Server: ' + server)\n",
+ " print('Username: ' + username)\n",
+    " print('Password: ' + '*' * len(password), flush=True)\n",
+ " print('Name of SAS Event Stream Manager Deployment: ' + chosen_deployment_name, flush=True)\n",
+ "\n",
+    "# Suppress SSL warnings caused by verify=False\n",
+ "requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)\n",
+ "bootstrap_server_and_credentials()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Get Access Token "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "def get_access_token():\n",
+ " body = {'grant_type': 'password', 'username': username, 'password': password}\n",
+ " headers = {'Content-Type': 'application/x-www-form-urlencoded', 'Authorization': 'Basic c2FzLmVjOg=='}\n",
+ " access_token_response = requests.post(server + '/SASLogon/oauth/token', data=body, headers=headers,\n",
+ " verify=False)\n",
+ " return access_token_response.json()[\"access_token\"]\n",
+ "\n",
+ "access_token = get_access_token()\n",
+ "print(access_token)\n",
+ "headers = {\"Content-Type\": \"application/json\", \"Authorization\": \"Bearer \" + access_token}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Get ESP Projects"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 5,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def get_projects():\n",
+ " projects = []\n",
+ " get_projects_response = requests.get(\n",
+ " server + '/SASEventStreamProcessingStudio/esp-project', headers=headers, verify=False)\n",
+ " if get_projects_response.status_code == 200:\n",
+ " projects = get_projects_response.json()[\"items\"]\n",
+ " return projects"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Import a Project to SAS Event Stream Processing Studio"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 6,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def import_project_to_studio(project_body):\n",
+ " project_id = \"\"\n",
+ " import_project_response = requests.post(server + '/SASEventStreamProcessingStudio/esp-project',\n",
+ " data=json.dumps(project_body), headers=headers, verify=False)\n",
+ " if import_project_response.status_code == 200:\n",
+ " project_id = import_project_response.json()[\"flowId\"]\n",
+ " print('success, project_id=' + project_id)\n",
+ " return project_id"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Get the Next Version Number of a Project"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 7,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def get_next_project_version(project_id):\n",
+ " version = 2\n",
+ " get_next_version_response = requests.get(\n",
+ " server + '/SASEventStreamProcessingStudio/project-versions/projects/' + project_id + '/nextVersion',\n",
+ " headers=headers, verify=False)\n",
+ " if get_next_version_response.status_code == 200:\n",
+ " version = get_next_version_response.json()[\"major\"]\n",
+ " else:\n",
+ " print('Failed to get next version')\n",
+ " print(get_next_version_response)\n",
+ " return version"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Make a Project Public\n",
+ "By default, a project is private and hidden from other users."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 8,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def make_project_public(project_id):\n",
+ " requests.patch(server + '/SASEventStreamProcessingStudio/esp-project/'\n",
+ " + project_id + '/authorization?private=false',\n",
+ " headers={'Authorization': 'Bearer ' + access_token}, verify=False)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Create the Expected Project Model to be Published"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 9,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def create_publish_project_body(project_body, version):\n",
+ " project_body[\"name\"] = project_body[\"friendlyName\"]\n",
+ " project_body[\"description\"] = \"\"\n",
+ " project_body[\"friendlyName\"] = None\n",
+ " project_body[\"majorVersion\"] = str(version)\n",
+ " project_body[\"minorVersion\"] = \"0\"\n",
+ " project_body[\"version\"] = str(version) + '.0'\n",
+ " project_body[\"versionNotes\"] = \"notes\"\n",
+ " project_body[\"uploadedBy\"] = username\n",
+ " project_body[\"modifiedBy\"] = username\n",
+ " epoch_time = int(time.time())\n",
+ " project_body[\"uploaded\"] = epoch_time\n",
+ " project_body[\"modified\"] = epoch_time + 10\n",
+ " project_body[\"isDeployable\"] = False"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Publish a Project"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 10,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def publish_project(project_id, project_body, version):\n",
+ " folder_id = \"\"\n",
+ " create_publish_project_body(project_body, version)\n",
+ " publish_project_response = requests.post(\n",
+ " server + '/SASEventStreamProcessingStudio/project-versions/projects/' + project_id,\n",
+ " data=json.dumps(project_body), headers=headers, verify=False)\n",
+ "\n",
+ " if publish_project_response.status_code == 200:\n",
+ " folder_id = publish_project_response.json()[\"folderId\"]\n",
+ " else:\n",
+ " print('PUBLISH FAILED')\n",
+ " print(publish_project_response)\n",
+ " print('Version:' + str(version))\n",
+ " return folder_id"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Synchronize Projects from SAS Event Stream Processing Studio to SAS Event Stream Manager"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 11,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def synchronize_project_for_ESM(folder_id):\n",
+ " success = False\n",
+ " synchronize_project_response = requests.post(\n",
+ " server + '/SASEventStreamProcessingStudio/project-versions/projects/synchronize/' + folder_id,\n",
+ " data=folder_id, headers=headers, verify=False)\n",
+ "\n",
+ " if synchronize_project_response.status_code == 200:\n",
+ " success = True\n",
+ " return success"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Get Deployment Details Needed to Start Kubernetes Cluster in SAS Event Stream Manager"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 12,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def get_deployment_details():\n",
+ " success = True\n",
+ " deployment_id = ''\n",
+ " deployment_name = ''\n",
+ " projects_running_on_deployment = []\n",
+ " deployments_response = requests.get(server + \"/SASEventStreamManager/deployment?noDetails=false\",\n",
+ " headers=headers, verify=False)\n",
+ "\n",
+ " if deployments_response.status_code != 200:\n",
+ " print(\"Could not find any deployments\", deployments_response.text)\n",
+ " success = False\n",
+ "\n",
+    " # Prefer the cluster deployment whose label matches chosen_deployment_name;\n",
+    " # if none matches, fall back to the first deployment in the list.\n",
+ " deployment_items = deployments_response.json()[\"items\"]\n",
+ " if len(deployment_items) > 0:\n",
+ " deployment_id = deployment_items[0][\"uuid\"]\n",
+ " deployment_name = deployment_items[0][\"label\"]\n",
+ " projects_running_on_deployment = get_project_names_from_deployment(deployment_items[0])\n",
+ " \n",
+ " for deployment in deployment_items:\n",
+ " if deployment[\"type\"] == \"cluster\" and deployment[\"label\"] == chosen_deployment_name:\n",
+ " deployment_id = deployment[\"uuid\"]\n",
+ " deployment_name = deployment[\"label\"]\n",
+ " projects_running_on_deployment = get_project_names_from_deployment(deployment)\n",
+ " else:\n",
+ " print(\"Could not find any deployments\", deployments_response.text)\n",
+ " success = False\n",
+ "\n",
+ " return success, deployment_id, deployment_name, projects_running_on_deployment\n",
+ "\n",
+ "def get_project_names_from_deployment(deployment):\n",
+ " project_names = []\n",
+    " for esp_server in deployment[\"servers\"]:\n",
+    " project_names = project_names + list(map(lambda project: project[\"name\"], esp_server[\"projects\"]))\n",
+ " return project_names"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Function to Start Kubernetes Cluster in SAS Event Stream Manager"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 13,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def start_k8s_cluster(k8s_project_body):\n",
+ " success, deployment_id, deployment_name, projects_running_on_deployment = get_deployment_details()\n",
+ "\n",
+ " if k8s_project_body[\"name\"] in projects_running_on_deployment:\n",
+ " print(\"Project \" + k8s_project_body[\"name\"] + \" is already running on deployment \" + deployment_name +\".\")\n",
+ " elif success:\n",
+ " # Here are the deployment settings hard-coded for all projects\n",
+ " k8s_deployment_settings = {\n",
+ " \"persistentVolumeClaim\": \"sas-event-stream-processing-studio-app\",\n",
+ " \"requestsMemory\": \"1Gi\",\n",
+ " \"requestsCpu\": 1,\n",
+ " \"requestsGpu\": \"0\",\n",
+ " \"limitsMemory\": \"1Gi\",\n",
+ " \"limitsCpu\": 1,\n",
+ " \"limitsGpu\": \"0\",\n",
+ " \"minReplicas\": 1,\n",
+ " \"maxReplicas\": 1,\n",
+ " \"useLoadBalancer\": False,\n",
+ " \"loadBalancingPolicy\": \"none\",\n",
+ " \"averageUtilization\": \"50\",\n",
+ " \"loadBalancerTargetsList\": []\n",
+ " }\n",
+ " k8s_cluster_body = {\n",
+ " \"distinguisher\": deployment_name,\n",
+ " \"esmDeploymentId\": deployment_id,\n",
+ " \"gpuReliant\": False,\n",
+ " \"loadOnly\": False,\n",
+ " \"project\": k8s_project_body,\n",
+ " \"settings\": k8s_deployment_settings\n",
+ " }\n",
+ "\n",
+ " response = requests.post(server + \"/SASEventStreamManager/server/cluster\",\n",
+ " data=json.dumps(k8s_cluster_body), headers=headers, verify=False)\n",
+ " if response.status_code != 200:\n",
+ " print(\"error creating cluster\", response.text)\n",
+ " else:\n",
+    " print(\"Project \" + k8s_project_body[\"name\"] + \" successfully started K8s pod in deployment: \" + deployment_name)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Create a SAS Event Stream Manager Deployment in Which Projects Can Run"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 3,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def create_ESM_cluster_deployment():\n",
+    " deployment_already_exists_error = 'A deployment named \"' + chosen_deployment_name + '\" already exists'\n",
+    " deployment_body = {\n",
+    " \"name\": chosen_deployment_name,\n",
+    " \"type\": \"cluster\"\n",
+    " }\n",
+    " response = requests.post(server + '/SASEventStreamManager/deployment',\n",
+    " data=json.dumps(deployment_body), headers=headers, verify=False)\n",
+    " if deployment_already_exists_error in response.text:\n",
+    " print(\"Deployment \" + chosen_deployment_name + \" already exists, we can proceed to start the projects\")\n",
+ " elif response.status_code != 201:\n",
+ " print(\"error creating ESM cluster deployment\", response.text)\n",
+ "\n",
+ "create_ESM_cluster_deployment()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "# Loop Through All Project XML Files in the `xml_projects` Folder and Perform the Following Steps\n",
+ "1. Check whether the projects have already been imported to SAS Event Stream Processing Studio.\n",
+ "2. If a project has been previously imported, then import it using the next version number. Otherwise, import the project as version 1.\n",
+ "3. Make the projects public so that they are visible to all users.\n",
+ "4. Publish the projects from SAS Event Stream Processing Studio to SAS Event Stream Manager.\n",
+ "5. Synchronize the projects.\n",
+ "6. Create a SAS Event Stream Manager deployment whose type is \"Cluster\".\n",
+ "7. Run the projects in the Kubernetes cluster. This action creates and starts an ESP server for each project."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def import_and_publish_xml_files():\n",
+ " current_dir = os.getcwd() + \"/xml_projects\"\n",
+ " projects = get_projects()\n",
+ " for file_name in os.listdir(current_dir):\n",
+ " print()\n",
+ " if not file_name.endswith('.xml'): continue\n",
+ " version = 1\n",
+ " project_id = None\n",
+ " xml_file_path = os.path.join(current_dir, file_name)\n",
+ " project_name = get_project_name_from_xml(xml_file_path)\n",
+    " with open(xml_file_path, \"r\") as xml_file:\n",
+    "     data = xml_file.read()\n",
+ " project_body = {'friendlyName': project_name, 'xml': data}\n",
+ "\n",
+ " if projects:\n",
+ " project_that_matches_xml_name = None\n",
+ " for project in projects:\n",
+ " if project['friendlyName'] == project_name:\n",
+ " project_that_matches_xml_name = project\n",
+ " break\n",
+ "\n",
+ " if project_that_matches_xml_name:\n",
+ " print(project_name + ' is already imported')\n",
+ " project_id = project_that_matches_xml_name['flowId']\n",
+ " version = get_next_project_version(project_id)\n",
+ " print('Creating new project version: ' + str(version))\n",
+ "\n",
+ " if not bool(project_id):\n",
+ " print('Importing ' + project_name)\n",
+ " project_id = import_project_to_studio(project_body)\n",
+ " make_project_public(project_id)\n",
+ "\n",
+ " folder_id = publish_project(project_id, project_body, version)\n",
+ "\n",
+ " synchronize_project_for_ESM(folder_id)\n",
+ "\n",
+ " if project_id:\n",
+ " print(file_name + ' successfully published', flush=True)\n",
+ " else:\n",
+ " print(file_name + ' failed to publish', flush=True)\n",
+ "\n",
+ " k8s_project_body = {\"id\": project_id, \"name\": project_name, \"version\": version}\n",
+ " start_k8s_cluster(k8s_project_body)\n",
+ "\n",
+ "def get_project_name_from_xml(xml_file_path):\n",
+ " tree = ET.parse(xml_file_path)\n",
+ " root = tree.getroot()\n",
+ " project_name = root.attrib['name']\n",
+ " return project_name\n",
+ "\n",
+ "\n",
+ "\n",
+ "import_and_publish_xml_files()\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": []
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3 (ipykernel)",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.11.2"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/lua_connector.xml b/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/lua_connector.xml
new file mode 100644
index 0000000..1ae51ba
--- /dev/null
+++ b/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/lua_connector.xml
@@ -0,0 +1,87 @@
+
+
+
+ fsduser
+ 1715251562335
+ fsduser
+ 1715251567920
+ {}
+ Example
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/python_connector.xml b/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/python_connector.xml
new file mode 100644
index 0000000..4a4630d
--- /dev/null
+++ b/IoT/eventStreamProcessing/python/import-and-publish-projects/xml_projects/python_connector.xml
@@ -0,0 +1,71 @@
+
+
+
+ {"cq1":{"Source":{"x":-435,"y":-400}}}
+ fsduser
+ 1715251658799
+ fsduser
+ 1715251663688
+ Example
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
\ No newline at end of file
diff --git a/README.md b/README.md
index 86ab318..4578613 100644
--- a/README.md
+++ b/README.md
@@ -4,14 +4,15 @@ This is a preview repository containing examples showing the capabilities of SAS
The APIs are group into the following categories:
-| API Category | Description |
-| ------ | ------ |
-| Visualization | Provide access to reports and report images |
-| Compute | Act on SAS compute and analytic servers, including Cloud Analytic Services (CAS) |
-| Text Analytics | Provide analysis and categorization of text documents |
-| Data Management | Enable data manipulation and data quality operations on data sources |
-| Decision Management | Provide access to machine scoring and business rules |
-| Core Services | Provide operations for shared resources such as files and folders |
+| API Category | Description |
+|---------------------|-----------------------------------------------------------------------------------|
+| Visualization | Provides access to reports and report images |
+| Compute | Acts on SAS compute and analytic servers, including Cloud Analytic Services (CAS) |
+| Text Analytics | Provides analysis and categorization of text documents |
+| Data Management | Enables data manipulation and data quality operations on data sources |
+| Decision Management | Provides access to machine scoring and business rules |
+| Core Services       | Provides operations for shared resources such as files and folders                |
+| IoT                 | Provides operations for IoT analytics and Event Stream Processing                 |
Within each grouping will be a folder for SAS and User contributions (we encourage external participation).