
Elias Lundmark
Mar 21, 2024

Keeping local environments in sync with your Cloud environments

We recently announced that we are improving the scalability of SQL databases in DXP Cloud Services. The new architecture also enhances overall security for SQL databases: we are hardening technical controls to maintain the confidentiality and integrity of our customers' data. This change has an unintended consequence, though: it prevents developers from connecting local development environments directly to SQL databases in DXP Cloud Services. We strongly advise against that practice anyway. While the ease of use and flexibility are appealing, manually managing and storing connection strings and credentials for service users greatly increases the risk of those credentials falling into the wrong hands, allowing potential attackers to access or modify data.

To avoid these risks, our new architecture disallows direct connections from third-party sources to SQL Servers running in DXP Cloud Services. Instead, use the paasportal or the API to export your databases and content for use in your local development environments; both are more secure and reliable methods.

How to export content

Via the paasportal

  1. Navigate to https://paasportal.episerver.net and select the project you wish to export a database from
  2. Navigate to the Troubleshoot tab
  3. In the ‘Export Database’ section, select the environment you wish to export the database from, and how long the paasportal should retain this copy.
  4. Once the export is done, click the database file to download it as .bacpac. These files can then be used to import your database to a local SQL server, or an Azure SQL Server.
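Once downloaded, the .bacpac can be imported into a local SQL Server with the SqlPackage command-line tool. This is a minimal sketch: it assumes SqlPackage is installed and on your PATH, and the file path, server name, and database name are placeholders you should replace with your own values.

```powershell
# Import the exported .bacpac into a local SQL Server instance.
# SqlPackage must be installed separately; all names below are examples.
SqlPackage /Action:Import `
  /SourceFile:"C:\backups\epicms.bacpac" `
  /TargetServerName:"localhost" `
  /TargetDatabaseName:"epicms-local" `
  /TargetTrustServerCertificate:True
```

The same .bacpac can be imported to an Azure SQL Server by pointing /TargetServerName at your Azure SQL endpoint instead.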

Via API with Powershell

  1. Navigate to https://paasportal.episerver.net and generate credentials as described here https://docs.developers.optimizely.com/digital-experience-platform/docs/authentication.
  2. Authenticate with Connect-EpiCloud, for example: Connect-EpiCloud -ClientKey <ClientKey> -ClientSecret <ClientSecret> -ProjectId <ProjectId>
  3. Start a database export with Start-EpiDatabaseExport, for example Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait
  4. Fetch the download link for the .bacpac with Get-EpiDatabaseExport
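Putting the steps above together, a full export-and-download session might look like the sketch below. It assumes the EpiCloud PowerShell module is installed, the credential placeholders are filled in, and that the object returned when -Wait completes exposes the download link as a downloadLink property (an assumption based on the DXP deployment API response shape).

```powershell
# Install the EpiCloud module if needed, then authenticate once per session.
Install-Module EpiCloud -Scope CurrentUser
Connect-EpiCloud -ClientKey "<ClientKey>" -ClientSecret "<ClientSecret>" -ProjectId "<ProjectId>"

# Start the export and block until it finishes; capture the result object.
$export = Start-EpiDatabaseExport -Environment Integration -DatabaseName epicms -Wait

# Download the .bacpac to the local machine.
# downloadLink is assumed here; inspect $export to confirm the property name.
Invoke-WebRequest -Uri $export.downloadLink -OutFile "epicms.bacpac"
```

If you start the export without -Wait, you can poll its status later with Get-EpiDatabaseExport and fetch the link once the export completes.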

Via the API you can also download BLOBs from the storage account: Get-EpiStorageContainer lists all storage containers, and Get-EpiStorageContainerSasLink creates a SAS URI that can be used to download BLOBs. For example,

 Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
  -Environment "Integration" `
  -StorageContainer "mysitemedia" `
  -RetentionHours 2
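The SAS URI returned above can be fed to a tool such as AzCopy to pull down the whole container. This sketch assumes AzCopy is installed and on your PATH, that the result object exposes the URI as a sasLink property (an assumption; inspect the returned object to confirm), and the local target path is a placeholder.

```powershell
# Capture the SAS link for the container, then copy its contents locally.
$result = Get-EpiStorageContainerSasLink -ProjectId "2372b396-6fd2-40ca-a955-57871fc497c9" `
  -Environment "Integration" `
  -StorageContainer "mysitemedia" `
  -RetentionHours 2

# AzCopy must be installed separately; --recursive copies all blobs.
azcopy copy $result.sasLink "C:\localdev\mysitemedia" --recursive
```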


Comments

Drew Douglas Mar 21, 2024 07:55 PM

This change to the accessibility of the SQL instances in the Integration environment is disappointing. Prior to us joining the project, Opti Expert Services set up one of our customers with a development model that strongly prefers connecting to the Integration database when running the solution locally. We've run successfully with local databases, but this change to DXP will require us to change messaging and other systems to keep local databases in sync with backend systems.

Eric Apr 3, 2024 09:48 PM

Never thought you should connect to DXP databases at any point, actually. Using client data should not be needed for development purposes. BUT if you do use this and download a database, a disclaimer could be handy: if you download a client database, you will most likely be dealing with PII, and in case of a breach you might be distributing information as a developer that your company would very much not want you to.

It's crucial to stay informed and understand the ins and outs of personal data before downloading a client database, in my opinion, and if you do so, have scripts ready to remove that information or have a Data Processing Agreement in place. :)
