Showing results for tags '.net'.

Found 15 results

  1. MattKC painstakingly backports .NET from Windows 98 to Windows 95, enabling many applications to run that otherwise would not. View the full article
  2. Fargate is a serverless compute engine for containers that works with both Amazon ECS and Amazon EKS. With AWS Fargate, we can run applications without managing servers (official information page). In this post, we will take a step-by-step approach to deploying and running a .NET Core Web API application on AWS Fargate Service. View the full article
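The post above is a step-by-step walkthrough; at a high level, getting a containerized .NET Core Web API onto Fargate usually follows a push-then-run flow like the sketch below. All names (repository, cluster, account ID, subnets) are placeholders, not values from the article; substitute your own.

```shell
# 1. Build the Web API image and push it to Amazon ECR
aws ecr create-repository --repository-name my-api
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-east-1.amazonaws.com
docker build -t my-api .
docker tag my-api:latest 111122223333.dkr.ecr.us-east-1.amazonaws.com/my-api:latest
docker push 111122223333.dkr.ecr.us-east-1.amazonaws.com/my-api:latest

# 2. Register a Fargate task definition and run it as a service
aws ecs register-task-definition --cli-input-json file://task-definition.json
aws ecs create-service \
  --cluster my-cluster \
  --service-name my-api-svc \
  --task-definition my-api-td \
  --desired-count 1 \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-abc123],securityGroups=[sg-abc123],assignPublicIp=ENABLED}"
```

The full article walks the same flow through the console instead of the CLI.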
  3. Dynatrace has extended the Application Security Module it provides for its observability platform to protect against vulnerabilities in runtime environments, including the Java Virtual Machine (JVM), Node.js runtime and .NET CLR. In addition, Dynatrace has extended its support to applications built using the Go programming language. The Dynatrace Application Security Module leverages existing Dynatrace tracing […] View the full article
  4. Porting Assistant for .NET now supports assessment and porting of legacy .NET Framework applications written in VB.NET. With this release, Porting Assistant will translate VB.NET class libraries, web APIs, and console applications to .NET Core 3.1, .NET 5, or .NET 6 to simplify the modernization of legacy .NET Framework applications written in VB.NET. Developers can use the Porting Assistant for .NET standalone tool or the Porting Assistant for .NET Visual Studio IDE extension to modernize their legacy VB.NET applications. Support for VB.NET is added alongside the existing support for assessment and porting of C#-based .NET Framework applications. View the full article
  5. We are happy to announce the general availability of the new streamlined deployment experience for .NET applications. With sensible defaults for all deployment settings, you can now get your .NET application up and running in just one click, or with a few easy steps, without needing deep expertise in AWS. You will receive recommendations on the optimal compute for your application, giving you more confidence in your initial deployments. You can find it in the AWS Toolkit for Visual Studio using the new “Publish to AWS” wizard. It is also available via the .NET CLI by installing the AWS Deploy Tool for .NET. Key capabilities:
  • Compute recommendations – learn which AWS compute is best suited for your application.
  • Dockerfile generation – a Dockerfile is auto-generated if required by your chosen AWS compute.
  • Auto packaging and deployment – your application is built and packaged as required by the chosen AWS compute; the tooling provisions the necessary infrastructure and deploys your application using the AWS CDK.
  • Repeatable and shareable deployments – generate well-organized and documented AWS CDK deployment projects, modify them to fit your specific use case, then version control them and share them with your team for repeatable deployments.
  • CI/CD integration – turn off the interactive features and use different deployment settings to push the same application bundle to different environments.
  • Help learning the AWS CDK for .NET – gradually learn the underlying AWS tools the experience is built on, such as the AWS CDK.
  View the full article
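For the CLI path the announcement mentions, the flow is roughly the following; the tool package name AWS.Deploy.Tools is assumed from the announcement-era tooling, so check the current AWS documentation if it has changed.

```shell
# Install the AWS Deploy Tool for .NET as a global dotnet tool
dotnet tool install -g AWS.Deploy.Tools

# From the directory containing your .NET project file,
# start the interactive deployment (it recommends a compute target)
dotnet aws deploy
```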
  6. Developers can now use the AWS Encryption SDK for .NET to help protect their data. This open-source release makes it easier for developers to encrypt and decrypt their data when building applications using the .NET developer platform. View the full article
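Getting started is a matter of adding the SDK's NuGet package to a project; the package ID AWS.EncryptionSDK below is an assumption based on the announcement-era NuGet listing, so verify it against the current AWS Encryption SDK for .NET documentation. The project name is a placeholder.

```shell
# Create a sample project and reference the AWS Encryption SDK for .NET
dotnet new console -n EncryptDemo
cd EncryptDemo
dotnet add package AWS.EncryptionSDK
```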
  7. AWS App2Container (A2C) is a command-line tool for modernizing .NET and Java applications into containerized applications. A2C analyzes and builds an inventory of all applications running in virtual machines, on-premises or in the cloud. You simply select the application you want to containerize, and A2C packages the application artifact and identified dependencies into container images, configures the network ports, and generates the ECS task and Kubernetes pod definitions. View the full article
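The A2C workflow described above maps onto a small set of CLI commands; this is a sketch of the usual sequence, with the application ID being a placeholder that the inventory step actually discovers for you.

```shell
# Run on the server hosting the application
sudo app2container init                                        # one-time setup (workspace, profile)
sudo app2container inventory                                   # list candidate applications with their IDs
sudo app2container analyze --application-id iis-app-123        # assess one application and its dependencies
sudo app2container containerize --application-id iis-app-123   # build the container image
sudo app2container generate app-deployment --application-id iis-app-123   # emit ECS/EKS deployment artifacts
```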
  8. Build5Nines Weekly provides your go-to source to keep up-to-date on all the latest Microsoft Azure news and updates. Included within Build5Nines Weekly newsletter are blog articles, podcasts, videos, and more from Microsoft and the greater community over the past week. Be sure to subscribe to Build5Nines Weekly to get the newsletter in your email every week and […] The article Latest Cloud News: .NET 5 Released, Apple Silicon M1 CPU, and more! (November 12, 2020 – Build5Nines Weekly) appeared first on Build5Nines. View the full article
  9. Porting Assistant for .NET is now open source. Users can now extend the data set with new recommendations for assessment and use the extended data set to scan their projects for incompatibilities. Users can also review and offer suggestions on existing data sets, and actively participate in the development process, bringing their experience and expert knowledge to the tool. They can review open issues in GitHub, comment on any issues they are familiar with, make suggestions, ask questions, or open new issues to start a new conversation. The source code for the compatibility analysis component, the assessment APIs, and the data set used for assessment is released under the Apache 2.0 license. View the full article
  10. As companies implement DevOps practices, standardizing the deployment of continuous integration and continuous deployment (CI/CD) pipelines is increasingly important. Your developer team may not have the ability or time to create your own CI/CD pipelines and processes from scratch for each new project. Additionally, creating a standardized DevOps process can help your entire company ensure that all development teams are following security and governance best practices... View the full article
  11. Porting Assistant for .NET now supports migrating legacy .NET Framework applications to the newly released .NET 5. .NET 5 is a major release with a broad set of features and improvements. With this updated release of Porting Assistant for .NET, customers can analyze and port their .NET Framework applications to either the new .NET 5 release or .NET Core 3.1. View the full article
  12. This is the second post in a two-part series in which you migrate and containerize a modernized enterprise application. In Part 1, we walked through a step-by-step approach to re-architect a legacy ASP.NET MVC application and port it to ASP.NET Core. In this post, you deploy the previously re-architected application to Amazon Elastic Container Service (Amazon ECS) and run it as a task with AWS Fargate.

Overview of solution

In the first post, you ported the legacy ASP.NET MVC application to ASP.NET Core. You will now package the same application as a Docker container and host it in an ECS cluster. The following diagram illustrates this architecture. You first launch a SQL Server Express RDS instance (1) and create a Cycle Store database on that instance, with tables for different categories and subcategories of bikes. You use the previously re-architected and modernized ASP.NET Core application as a starting point for this post; the app uses AWS Secrets Manager (2) to fetch the database credentials it needs to access the Amazon RDS instance. Next, you build a Docker image of the application and push it to Amazon Elastic Container Registry (Amazon ECR) (3). After this, you create an ECS cluster (4) to run the Docker image as an AWS Fargate task.

Prerequisites

For this walkthrough, you should have the following prerequisites:
  • An AWS account.
  • An AWS Identity and Access Management (IAM) user with AdministratorAccess. For instructions, see the deep link to create an IAM role with Administrator access.
  • AWS Tools for Windows. Follow these steps to set up your AWS profile.
  • .NET Core 3.1 SDK installed. For instructions, see Download .NET Core 3.1.
  • Microsoft Visual Studio 2017 or later (Visual Studio Code is an alternative).
  • SQL Server Management Studio to connect to the SQL Server instance.
  • ASP.NET application development experience.

This post implements the solution in Region us-east-1.

Source code

Clone the source code from the GitHub repo.
The source code folder contains the re-architected source code, the AWS CloudFormation template to launch the infrastructure, and the Amazon ECS task definition.

Setting up the database server

To make sure that your database works out of the box, you use a CloudFormation template to create an instance of Microsoft SQL Server Express, AWS Secrets Manager secrets to store the database credentials, security groups, and IAM roles to access Amazon Relational Database Service (Amazon RDS) and Secrets Manager. This stack takes approximately 15 minutes to complete, most of which is spent provisioning the services.
  1. On the AWS CloudFormation console, choose Create stack.
  2. For Prepare template, select Template is ready.
  3. For Template source, select Upload a template file.
  4. Upload SqlServerRDSFixedUidPwd.yaml, which is available in the GitHub repo.
  5. Choose Next.
  6. For Stack name, enter SQLRDSEXStack.
  7. Choose Next.
  8. Keep the rest of the options at their default.
  9. Select I acknowledge that AWS CloudFormation might create IAM resources with custom names.
  10. Choose Create stack.
  11. When the status shows as CREATE_COMPLETE, choose the Outputs tab.
  12. Record the value for the SQLDatabaseEndpoint key.
  13. Connect to the database from SQL Server Management Studio with the following credentials: User id: DBUser, Password: DBU$er2020

Setting up the CYCLE_STORE database

To set up your database, complete the following steps:
  1. On the SQL Server Management console, connect to the DB instance using the ID and password you defined earlier.
  2. Under File, choose New, then choose Query with Current Connection. Alternatively, choose New Query from the toolbar.
  3. Open CYCLE_STORE_Schema_data.sql from the GitHub repository and run it.

This creates the CYCLE_STORE database with all the tables and data you need.

Setting up the ASP.NET MVC Core application

To set up your ASP.NET application, open the re-architected application code that you cloned from the GitHub repo.
The Dockerfile added to the solution enables Docker support. Open the appsettings.Development.json file and replace the RDS endpoint in the ConnectionStrings section with the output of the AWS CloudFormation stack, without the port number (:1433 for SQL Server). The ASP.NET application should now load with bike categories and subcategories. See the following screenshot.

Setting up Amazon ECR

To set up your repository in Amazon ECR, complete the following steps:
  1. On the Amazon ECR console, choose Repositories.
  2. Choose Create repository.
  3. For Repository name, enter coretoecsrepo.
  4. Choose Create repository.
  5. Copy the repository URI to use later.
  6. Select the repository you just created and choose View push commands.
  7. In the folder where you cloned the repo, navigate to the AdventureWorksMVCCore.Web folder.
  8. In the View push commands popup window, complete steps 1–4 to push your Docker image to Amazon ECR. Make sure your working directory is set to AdventureWorksMVCCore.Web. The following screenshots show the completion of Steps 1–4.

Setting up Amazon ECS

To set up your ECS cluster, complete the following steps:
  1. On the Amazon ECS console, choose Clusters.
  2. Choose Create cluster.
  3. Choose the Networking only cluster template.
  4. Name your cluster cycle-store-cluster.
  5. Leave everything else at its default and choose Create cluster.
  6. Select your cluster.
  7. Choose Task Definitions and choose Create new Task Definition.
  8. On the Select launch type compatibility page, choose FARGATE and choose Next step.
  9. On the Configure task and container definitions page, scroll to the bottom of the page and choose Configure via JSON.
  10. In the text area, enter the task definition JSON (task-definition.json) provided in the GitHub repo. Make sure to replace [YOUR-AWS-ACCOUNT-NO] in task-definition.json with your AWS account number on lines 44, 68, and 71.
The task definition file assumes that you named your repository coretoecsrepo. If you named it something else, modify the file accordingly. It also assumes that you are using us-east-1 as your default Region; if not, replace the Region in task-definition.json on lines 15 and 44.
  11. Choose Save.
  12. On the Task Definitions page, select cycle-store-td.
  13. From the Actions drop-down menu, choose Run Task.
  14. For Launch type, choose Fargate.
  15. Choose your default VPC as Cluster VPC.
  16. Select at least one Subnet.
  17. Choose Edit Security Groups and select ECSSecurityGroup (created by the AWS CloudFormation stack).
  18. Choose Run Task.

Running your application

Choose the link under the task and find the public IP. When you navigate to the URL http://your-public-ip, you should see the .NET Core Cycle Store web application user interface running in Amazon ECS. See the following screenshot.

Cleaning up

To avoid incurring future charges, delete the stacks you created for this post.
  1. On the AWS CloudFormation console, choose Stacks.
  2. Select SQLRDSEXStack.
  3. In the Stack details pane, choose Delete.

Conclusion

This post concludes your journey towards modernizing a legacy enterprise ASP.NET MVC web application using .NET Core and containerizing it on Amazon ECS with the AWS Fargate compute engine on a Linux container. Porting to .NET Core lets you run enterprise workloads without any dependency on a Windows environment, and AWS Fargate gives you a way to run containers directly, without managing any EC2 instances, while retaining full control. Additionally, AWS recently launched two tools in this area:
  • Insight and assistance for porting from .NET Framework to .NET Core: https://aws.amazon.com/porting-assistant-dotnet/
  • Containerize and migrate existing applications: https://aws.amazon.com/app2container/

About the Authors

Saleha Haider is a Senior Partner Solution Architect with Amazon Web Services.
Pratip Bagchi is a Partner Solutions Architect with Amazon Web Services. View the full article
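The "View push commands" steps (1–4) referenced in the walkthrough above typically expand to something like the following. The account ID is a placeholder; the repository name and Region follow the walkthrough (coretoecsrepo in us-east-1), so substitute your own values.

```shell
cd AdventureWorksMVCCore.Web

# Step 1: authenticate Docker to your ECR registry
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-east-1.amazonaws.com

# Step 2: build the image
docker build -t coretoecsrepo .

# Step 3: tag it with the repository URI
docker tag coretoecsrepo:latest 111122223333.dkr.ecr.us-east-1.amazonaws.com/coretoecsrepo:latest

# Step 4: push it to Amazon ECR
docker push 111122223333.dkr.ecr.us-east-1.amazonaws.com/coretoecsrepo:latest
```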
  13. Happy Friday! I’m back with a collection of posts from you, the wonderful Azure DevOps community. They range from using Azure DevOps to the fullest, to code formatting and security, to managing Azure Lighthouse.
  • Enforce .NET code style in CI with dotnet format – Gérald shows us how to use the dotnet format command in an Azure Pipeline to check code style.
  • Using a Lighthouse Service Principal within Azure DevOps – Now, this is cool: Thijs explains how they use Azure DevOps to manage Azure Sentinel across multiple environments.
  • Automate Azure DevOps code security analysis with the Microsoft Security Code Analysis extensions – Tobias shares an update to his post on code security extensions in Azure Pipelines.
  • Most Teams Aren’t Using Azure DevOps to Its Full Potential – Are you using Azure DevOps to the fullest? Christina shares her thoughts on the matter.
  If you’ve written an article about Azure DevOps or found some great content about DevOps on Azure, please share it with the #AzureDevOps hashtag on Twitter! The post Top Stories from the Microsoft DevOps Community – 2020.08.28 appeared first on Azure DevOps Blog. View the full article
  14. The trend of building AWS Serverless applications using AWS Lambda is increasing at an ever-rapid pace. Common use cases for AWS Lambda include data processing, real-time file processing, extract, transform, and load (ETL), web backends, internet of things (IoT) backends, and mobile backends. Lambda natively supports languages such as Java, Go, PowerShell, Node.js, C#, Python, and Ruby. It also provides a Runtime API that allows you to use additional programming languages to author your functions.

The .NET framework occupies a significant footprint in the technology landscape of enterprises. Nowadays, enterprise customers are modernizing .NET framework applications to .NET Core using AWS Serverless (Lambda). In this journey, you break down a large monolithic service into multiple smaller, independent, and autonomous microservices built as .NET Core Lambda functions.

When you have several microservices running in production, a change management strategy is key for business agility and time-to-market. The change management of .NET Core Lambda functions translates to how well you implement an automated CI/CD pipeline using AWS CodePipeline. In this post, you see two approaches for implementing CI/CD for .NET Core Lambda functions: creating a pipeline with either two or three stages.

Creating a pipeline with two stages

In this approach, you define the pipeline in CodePipeline with two stages: AWS CodeCommit and AWS CodeBuild. CodeCommit is the fully managed source control repository that stores the source code for the .NET Core Lambda functions. It triggers CodeBuild when a new code change is published. CodeBuild defines a compute environment for the build process. It builds the .NET Core Lambda function and creates a deployment package (.zip). Finally, CodeBuild uses the AWS extensions for the dotnet CLI to deploy the Lambda packages (.zip) to the Lambda environment. The following diagram illustrates this architecture.
CodePipeline with CodeBuild and CodeCommit stages.

Creating a pipeline with three stages

In this approach, you define the pipeline with three stages: CodeCommit, CodeBuild, and AWS CodeDeploy. CodeCommit stores the source code for the .NET Core Lambda functions and triggers CodeBuild when a new code change is published. CodeBuild defines a compute environment for the build process and builds the .NET Core Lambda function. Then CodeBuild invokes the CodeDeploy stage. CodeDeploy uses AWS CloudFormation templates to deploy the Lambda function to the Lambda environment. The following diagram illustrates this architecture. CodePipeline with CodeCommit, CodeBuild, and CodeDeploy stages.

Solution overview

In this post, you learn how to implement an automated CI/CD pipeline using the first approach: CodePipeline with CodeCommit and CodeBuild stages. The CodeBuild stage in this approach implements both the build and deploy functionality. The high-level steps are as follows:
  1. Create the CodeCommit repository.
  2. Create a Lambda execution role.
  3. Create a Lambda project with the .NET Core CLI.
  4. Change the Lambda project configuration.
  5. Create a buildspec file.
  6. Commit changes to the CodeCommit repository.
  7. Create your CI/CD pipeline.
  8. Complete and verify pipeline creation.

For the source code and buildspec file, see the GitHub repo.

Prerequisites

Before you get started, you need the following prerequisites:
  • macOS, Linux, or up-to-date Windows 10
  • Visual Studio 2019 with the latest updates (if on Windows)
  • .NET Core 3.1
  • AWS extensions for the dotnet CLI
  • The Amazon.Lambda.Templates package

Creating a CodeCommit repository

You first need a CodeCommit repository to store the Lambda project source code.
  1. In the Repository settings section, for Repository name, enter a name for your repository.
  2. Choose Create.
  3. Initialize this repository with a markdown file (readme.md). You need this markdown file to create documentation about the repository.
  4.
Set up an AWS Identity and Access Management (IAM) credential for CodeCommit, or alternatively set up SSH-based access. For instructions, see Setup for HTTPS users using Git credentials and Setup steps for SSH connections to AWS CodeCommit repositories on Linux, macOS, or Unix. You need this to work with the CodeCommit repository from the development environment.
  5. Clone the CodeCommit repository to a local folder.

Proceed to the next step to create an IAM role for Lambda execution.

Creating a Lambda execution role

Every Lambda function needs an IAM role for execution. Create an IAM role for Lambda execution with the appropriate IAM policy, if it doesn’t exist already. You’re now ready to create a Lambda function project using the .NET Core Command Line Interface (CLI).

Creating a Lambda function project

You have multiple options for creating .NET Core Lambda function projects, such as Visual Studio 2019, Visual Studio Code, and the .NET Core CLI. In this post, you use the .NET Core CLI. By default, the .NET Core CLI doesn’t support Lambda projects; you need the Amazon.Lambda.Templates NuGet package to create your project. Install it to make all the Amazon Lambda project templates available in the development environment:

dotnet new -i Amazon.Lambda.Templates::*

Verify the installation with the following CLI command:

dotnet new

You should see output reflecting the presence of various Lambda templates in the development environment. You also need to install the AWS extensions for the dotnet Lambda CLI to deploy and invoke Lambda functions from the terminal or command prompt:

dotnet tool install -g Amazon.Lambda.Tools
dotnet tool update -g Amazon.Lambda.Tools

You’re now ready to create a Lambda function project in the development environment. Navigate to the root of the cloned CodeCommit repository (which you created in the previous step).
Create the Lambda function by entering the following CLI command:

dotnet new lambda.EmptyFunction --name Dotnetlambda4 --profile default --region us-east-1

After you create your Lambda function project, you need to make some configuration changes.

Changing the Lambda function project configuration

When you create a .NET Core Lambda function project, it adds the configuration file aws-lambda-tools-defaults.json at the root of the project directory. This file holds the various configuration parameters for Lambda execution. Make sure that the function role is set to the IAM role you created earlier, and that the profile is set to default. The updated aws-lambda-tools-defaults.json file should look like the following code:

{
  "Information": [
    "This file provides default values for the deployment wizard inside Visual Studio and the AWS Lambda commands added to the .NET Core CLI.",
    "To learn more about the Lambda commands with the .NET Core CLI execute the following command at the command line in the project root directory.",
    "dotnet lambda help",
    "All the command line options for the Lambda command can be specified in this file."
  ],
  "profile": "default",
  "region": "us-east-1",
  "configuration": "Release",
  "framework": "netcoreapp3.1",
  "function-runtime": "dotnetcore3.1",
  "function-memory-size": 256,
  "function-timeout": 30,
  "function-handler": "Dotnetlambda4::Dotnetlambda4.Function::FunctionHandler",
  "function-role": "arn:aws:iam::awsaccountnumber:role/testlambdarole"
}

After you update your project configuration, you’re ready to create the buildspec.yml file.

Creating a buildspec file

As a prerequisite to configuring the CodeCommit stage, you created a Lambda function project. For the CodeBuild stage, you need to create a buildspec file.
Create a buildspec.yml file with the following definition and save it at the root of the CodeCommit directory:

version: 0.2
env:
  variables:
    DOTNET_ROOT: /root/.dotnet
  secrets-manager:
    AWS_ACCESS_KEY_ID_PARAM: CodeBuild:AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY_PARAM: CodeBuild:AWS_SECRET_ACCESS_KEY
phases:
  install:
    runtime-versions:
      dotnet: 3.1
  pre_build:
    commands:
      - echo Restore started on `date`
      - export PATH="$PATH:/root/.dotnet/tools"
      - pip install --upgrade awscli
      - aws configure set profile $Profile
      - aws configure set region $Region
      - aws configure set aws_access_key_id $AWS_ACCESS_KEY_ID_PARAM
      - aws configure set aws_secret_access_key $AWS_SECRET_ACCESS_KEY_PARAM
      - cd Dotnetlambda4
      - cd src
      - cd Dotnetlambda4
      - dotnet clean
      - dotnet restore
  build:
    commands:
      - echo Build started on `date`
      - dotnet new -i Amazon.Lambda.Templates::*
      - dotnet tool install -g Amazon.Lambda.Tools
      - dotnet tool update -g Amazon.Lambda.Tools
      - dotnet lambda deploy-function "Dotnetlambda4" --function-role "arn:aws:iam::yourawsaccount:role/youriamroleforlambda" --region "us-east-1"

You’re now ready to commit your changes to the CodeCommit repository.

Committing changes to the CodeCommit repository

To push changes to your CodeCommit repository, enter the following git commands:

git add --all
git commit -a -m "Initial Comment"
git push

After you commit the changes, you can create your CI/CD pipeline using CodePipeline.

Creating a CI/CD pipeline

To create your pipeline with a CodeCommit and CodeBuild stage, complete the following steps:
  1. In the Pipeline settings section, for Pipeline name, enter a name.
  2. For Service role, select New service role.
  3. For Role name, use the auto-generated name.
  4. Select Allow AWS CodePipeline to create a service role so it can be used with this new pipeline.
  5. Choose Next.
  6. In the Source section, for Source provider, choose AWS CodeCommit.
  7. For Repository name, choose your repository.
  8. For Branch name, choose your branch.
  9. For Change detection options, select Amazon CloudWatch Events.
  10. Choose Next.
  11. In the Build section, for Build provider, choose AWS CodeBuild.
  12. For Environment image, choose Managed image.
  13. For Operating system, choose Ubuntu.
  14. For Image, choose aws/codebuild/standard:4.0.
  15. For Image version, choose Always use the latest image for this runtime version.

CodeBuild needs to assume an IAM service role to get the required privileges for a successful build. Create a new service role for the CodeBuild project and attach the following IAM policy to the role:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "SecretManagerRead",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetRandomPassword",
        "secretsmanager:GetResourcePolicy",
        "secretsmanager:UntagResource",
        "secretsmanager:GetSecretValue",
        "secretsmanager:DescribeSecret",
        "secretsmanager:ListSecretVersionIds",
        "secretsmanager:ListSecrets",
        "secretsmanager:TagResource"
      ],
      "Resource": "*"
    }
  ]
}

You now need to define the compute and environment variables for CodeBuild.
  1. For Compute, select your preferred compute.
  2. For Environment variables, enter two variables: for Region, enter your preferred Region; for Profile, enter the value default. This allows the environment to use the default AWS profile in the build process.

To set up an AWS profile, the CodeBuild environment needs an AccessKeyId and SecretAccessKey. As a best practice, configure AccessKeyId and SecretAccessKey as secrets in AWS Secrets Manager and reference them in buildspec.yml.
  1. On the Secrets Manager console, choose Store a new secret.
  2. For Select secret type, select Other type of secrets.
  3. Configure the secrets AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
  4. For the encryption key, choose DefaultEncryptionKey.
  5. Choose Next.
  6. For Secret name, enter CodeBuild.
  7. Leave the rest of the selections at their default and choose Store.
  8. In the Add deploy stage section, choose Skip deploy stage.
Completing and verifying your pipeline

After you save your pipeline, push the code changes of the Lambda function from the local repository to the remote CodeCommit repository. After a few seconds, you should see the activation of the CodeCommit stage and a transition to the CodeBuild stage. Pipeline creation can take up to a few minutes. You can verify your pipeline on the CodePipeline console. This should deploy the Lambda function changes to the Lambda environment.

Cleaning up

If you no longer need the following resources, delete them to avoid incurring further charges:
  • CodeCommit repository
  • CodePipeline project
  • CodeBuild project
  • IAM role for Lambda execution
  • Lambda function

Conclusion

In this post, you implemented an automated CI/CD pipeline for .NET Core Lambda functions using two stages of CodePipeline: CodeCommit and CodeBuild. You can apply this solution to your own use cases.

About the author

Sundararajan Narasiman works as a Senior Partner Solutions Architect with Amazon Web Services. View the full article
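Beyond watching the console, you can confirm that the pipeline's deploy step actually updated the function by invoking it with the same Amazon.Lambda.Tools extension the buildspec installs. The function name and Region follow the walkthrough; the payload is an arbitrary test string.

```shell
# Invoke the deployed function and print its response and log tail
dotnet lambda invoke-function Dotnetlambda4 --payload "hello" --region us-east-1
```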
  15. Tens of thousands of .NET applications are running across the world, many of which are ASP.NET web applications. This number becomes interesting when you consider that the .NET framework, as we know it, will be changing significantly. The current release schedule for .NET 5.0 is November 2020, and going forward there will be just one .NET that you can use to target multiple platforms like Windows and Linux. This is important because .NET applications running on version 4.8 and lower can’t automatically upgrade to this new version of .NET: .NET 5.0 is based on .NET Core and thus has breaking changes when upgrading from an older version of .NET.

This is an important step in the .NET ecosystem because it enables .NET applications to move beyond the Windows world. However, it also means that active applications need to go through a refactoring before they can take advantage of this new definition. One choice is to wait until the new version of .NET is released and start the refactoring process at that time. The second choice is to get an early start and begin converting your applications to .NET Core 3.1 so that the migration to .NET 5.0 will be smoother.

This post demonstrates an approach to migrating an ASP.NET MVC (Model View Controller) web application using Entity Framework 6 to ASP.NET Core with Entity Framework Core. It shows the steps to modernize a legacy enterprise ASP.NET MVC web application using .NET Core, along with converting Entity Framework to Entity Framework Core.

Overview of the solution

The first step is to get an ASP.NET MVC application and its required database server up and working in your AWS environment. We take this approach so you can run the application locally to see how it works. You first set up the database: SQL Server running in Amazon Relational Database Service (Amazon RDS). Amazon RDS provides a managed SQL Server experience.
After you define the database, you set up schema and data. If you already have your own SQL Server instance running, you can load the data there instead; you simply need to ensure your connection string points to that server rather than the Amazon RDS server you set up in this walkthrough. Next, you launch a legacy ASP.NET MVC web application that displays lists of bike categories and their subcategories. This legacy application uses Entity Framework 6 to fetch data from the database. Finally, you take a step-by-step approach to convert the same use case and create a new ASP.NET Core web application, using Entity Framework Core to fetch data from the database. As a best practice, you also use AWS Secrets Manager to store database login information.

Prerequisites

For this walkthrough, you should have the following prerequisites:
  • An AWS account
  • An AWS user with AdministratorAccess (see the instructions on the AWS Identity and Access Management (IAM) console)
  • Access to the following AWS services: Amazon RDS, Amazon Simple Storage Service (Amazon S3), and Secrets Manager
  • .NET Core 3.1 SDK installed
  • Microsoft Visual Studio 2017 or later (Visual Studio Code is an alternative)
  • SQL Server Management Studio to connect to the SQL Server instance
  • ASP.NET application development experience

Setting up the database server

For this walkthrough, we have provided an AWS CloudFormation template inside the GitHub repository to create an instance of Microsoft SQL Server Express, which can be downloaded from this link.
  1. On the AWS CloudFormation console, choose Create stack.
  2. For Prepare template, select Template is ready.
  3. For Template source, select Upload a template file.
  4. Upload SqlServerRDSFixedUidPwd.yaml and choose Next.
  5. For Stack name, enter SQLRDSEXStack and choose Next.
  6. Keep the rest of the options at their default.
  7. Select I acknowledge that AWS CloudFormation might create IAM resources with custom names.
  8. Choose Create stack.
9. When the status shows as CREATE_COMPLETE, choose the Outputs tab and record the value of the SQLDatabaseEndpoint key.

Connect to the database from SQL Server Management Studio with the following credentials:

User id: DBUser
Password: DBU$er2020

Setting up the CYCLE_STORE database

To set up your database, complete the following steps:

1. In SQL Server Management Studio, connect to the DB instance using the ID and password you defined earlier.
2. Under File, choose New, then Query with Current Connection. Alternatively, choose New Query from the toolbar.
3. Download cycle_store_schema_data.sql and run it. This creates the CYCLE_STORE database with all the tables and data you need.

Setting up and validating the legacy MVC application

1. Download the source code from the GitHub repo.
2. Open AdventureWorksMVC_2013.sln and modify the database connection string in the web.config file by replacing the Data Source property value with the server name from your Amazon RDS setup.

The ASP.NET application should load with bike categories and subcategories. The following screenshot shows the Unicorn Bike Rentals website after configuration.

Now that you have the legacy application running locally, you can look at what it takes to refactor it into a .NET Core 3.1 application. Two main approaches are available:

- Update in place – Make all the changes within a single code set.
- Move code – Create a new .NET Core solution and move the code over piece by piece.

This post shows the second approach because it requires less scaffolding.

Creating a new MVC Core application

To create your new MVC Core application, complete the following steps:

1. Open Visual Studio.
2. From the Get Started page, choose Create a new project.
3. Choose ASP.NET Core Web Application.
4. For Project name, enter AdventureWorksMVCCore.Web.
5. Add a Location you prefer.
6. For Solution name, enter AdventureWorksMVCCore.
7. Choose Web Application (Model-View-Controller).
8. Choose Create.
9. Make sure that the project is set to use .NET Core 3.1.
10. Choose Build, Build Solution (or press Ctrl+Shift+B) to make sure the current solution builds correctly. You should get a default ASP.NET Core start page.

Aligning the projects

ASP.NET Core MVC depends on a well-known folder structure: much of the scaffolding expects view source files in the Views folder, controller source files in the Controllers folder, and so on. In the legacy application, some non-.NET-specific folders, such as css and images, sit at the same level. In .NET Core, static content is expected to live in a new construct, the wwwroot folder; this includes JavaScript, CSS, and image files. You also need to update the configuration file with the same database connection string you used earlier.

Your first step is to move the static content.

1. In the .NET Core solution, delete all the static content created during solution creation: the css, js, and lib directories and the favicon.ico file.
2. Copy the css, favicon, and Images folders from the legacy solution into the wwwroot folder of the new solution. When completed, your .NET Core wwwroot directory should appear like the following screenshot.
3. Open appsettings.Development.json and add a ConnectionStrings section (replace the server with the Amazon RDS endpoint you have already been using). See the following code:

{
  "ConnectionStrings": {
    "DefaultConnection": "Server=sqlrdsdb.xxxxx.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=DBUser;Password=DBU$er2020;"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft": "Warning",
      "Microsoft.Hosting.Lifetime": "Information"
    }
  }
}

Setting up Entity Framework Core

One of the changes in .NET Core concerns Entity Framework. Entity Framework Core is a lightweight, extensible data access technology.
It can act as an object-relational mapper (O/RM) that lets you interact with the database using .NET objects, abstracting away much of the database access code. To use Entity Framework Core, you first add its packages to the project.

1. In the Solution Explorer window, choose the project (right-click) and choose Manage NuGet Packages…
2. On the Browse tab, search for the latest stable versions of these two NuGet packages (you should see a screen similar to the following screenshot):

- Microsoft.EntityFrameworkCore.SqlServer
- Microsoft.EntityFrameworkCore.Tools

After you add the packages, the next step is to generate the database models. This is part of the O/RM functionality: the models map to the database tables and carry information about the fields, constraints, and whatever else is necessary to keep the generated models in sync with the database. Fortunately, there is an easy way to generate those models:

1. Open the Package Manager Console from Visual Studio.
2. Enter the following command (replace the server endpoint):

Scaffold-DbContext "Server=sqlrdsdb.xxxxxx.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=DBUser;Password=DBU`$er2020;" Microsoft.EntityFrameworkCore.SqlServer -OutputDir Models

The ` in the password, right before the $, is the escape character.

You should now have a Context class and several model classes from the database stored in the Models folder. See the following screenshot.

Open the CYCLE_STOREContext.cs file under the Models folder and comment out the following lines of code, as shown in the following screenshot. You instead take advantage of the middleware to read the connection string from the appsettings.Development.json file you previously configured.

if (!optionsBuilder.IsConfigured)
{
#warning To protect potentially sensitive information in your connection string, you should move it out of source code. See http://go.microsoft.com/fwlink/?LinkId=723263 for guidance on storing connection strings.
    optionsBuilder.UseSqlServer(
        "Server=sqlrdsdb.cph0bnedghnc.us-east-1.rds.amazonaws.com; " +
        "Database=CYCLE_STORE;User Id=DBUser;Password=DBU$er2020;");
}

Open the Startup.cs file and add the following lines of code in the ConfigureServices method; you need to add references to AdventureWorksMVCCore.Web.Models and Microsoft.EntityFrameworkCore in the using statements. This reads the connection string from the appsettings file and wires it into Entity Framework Core:

services.AddDbContext<CYCLE_STOREContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

Setting up a service layer

Because you're working with an MVC application, the application control logic belongs in the controller. In the previous step, you created the data access layer; you now create the service layer, which mediates communication between the controller and the data access layer. Generally, this is where concerns such as business logic and validation live. Defining interfaces for these services follows the dependency inversion and interface segregation principles.

1. Create a folder named Service under the project directory.
2. Create two subfolders, Interface and Implementation, under the new Service folder.
3. Add a new interface, ICategoryService, to the Service\Interface folder with the following code:

using AdventureWorksMVCCore.Web.Models;
using System.Collections.Generic;

namespace AdventureWorksMVCCore.Web.Service.Interface
{
    public interface ICategoryService
    {
        List<ProductCategory> GetCategoriesWithSubCategory();
    }
}

4. Add a new service file, CategoryService, to the Service\Implementation folder.
Create a class file CategoryService.cs and implement the interface you just created with the following code:

using AdventureWorksMVCCore.Web.Models;
using AdventureWorksMVCCore.Web.Service.Interface;
using Microsoft.EntityFrameworkCore;
using System.Collections.Generic;
using System.Linq;

namespace AdventureWorksMVCCore.Web.Service.Implementation
{
    public class CategoryService : ICategoryService
    {
        private readonly CYCLE_STOREContext _context;

        public CategoryService(CYCLE_STOREContext context)
        {
            _context = context;
        }

        public List<ProductCategory> GetCategoriesWithSubCategory()
        {
            return _context.ProductCategory
                .Include(category => category.ProductSubcategory)
                .ToList();
        }
    }
}

Now that you have the interface and implementation completed, the next step is to add the dependency resolver. This registers the interface in the application's service collection and tells the container how to instantiate the class when it's injected into a constructor. To add this mapping, open the Startup.cs file and add the following line of code below where you added the DbContext:

services.TryAddScoped<ICategoryService, CategoryService>();

You may also need to add the following references:

using AdventureWorksMVCCore.Web.Service.Implementation;
using AdventureWorksMVCCore.Web.Service.Interface;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;

Setting up view components

In this section, you move the UI to your new ASP.NET Core project. In ASP.NET Core, the default folder structure for managing views is different than it was in ASP.NET MVC, and the formatting of the Razor files is slightly different as well.

1. Under Views, Shared, create a folder called Components.
2. Create a subfolder called Header.
3. In the Header folder, create a new view called Default.cshtml and enter the following code:

<div method="post" asp-action="header" asp-controller="home">
    <div id="hd">
        <div id="doc2" class="yui-t3 wrapper">
            <table>
                <tr>
                    <td>
                        <h2 class="banner">
                            <a href="@Url.Action("Default","Home")" id="LnkHome">
                                <img src="/Images/logo.png" style="width:125px;height:125px" alt="ComponentOne" />
                            </a>
                        </h2>
                    </td>
                    <td class="hd-header"><h2 class="banner">Unicorn Bike Rentals</h2></td>
                </tr>
            </table>
        </div>
    </div>
</div>

4. Create a class within the Header folder called HeaderLayout.cs and enter the following code:

using Microsoft.AspNetCore.Mvc;

namespace AdventureWorksMVCCore.Web
{
    public class HeaderViewComponent : ViewComponent
    {
        public IViewComponentResult Invoke()
        {
            return View();
        }
    }
}

You can now create the content view component, which shows the bike categories and subcategories.

5. Under Views, Shared, Components, create a folder called Content.
6. Create a class ContentLayoutModel.cs and enter the following code:

using AdventureWorksMVCCore.Web.Models;
using System.Collections.Generic;

namespace AdventureWorksMVCCore.Web.Views.Components
{
    public class ContentLayoutModel
    {
        public List<ProductCategory> ProductCategories { get; set; }
    }
}

7. In this folder, create a view Default.cshtml and enter the following code:

@model AdventureWorksMVCCore.Web.Views.Components.ContentLayoutModel
<div method="post" asp-action="footer" asp-controller="home">
    <div class="content">
        <div class="footerinner">
            <div id="PnlExpFooter">
                <div>
                    @foreach (var category in Model.ProductCategories)
                    {
                        <div asp-for="@category.Name" class=@($"{category.Name}Menu")>
                            <h1><b>@category.Name</b></h1>
                            <ul class=@($"{category.Name}List")>
                                @foreach (var subCategory in category.ProductSubcategory.ToList())
                                {
                                    <li>@subCategory.Name</li>
                                }
                            </ul>
                        </div>
                    }
                </div>
            </div>
        </div>
    </div>
</div>

8. Create a class ContentLayout.cs and enter the following code:

using AdventureWorksMVCCore.Web.Models;
using AdventureWorksMVCCore.Web.Service.Interface;
using Microsoft.AspNetCore.Mvc;

namespace AdventureWorksMVCCore.Web.Views.Components
{
    public class ContentViewComponent : ViewComponent
    {
        private readonly CYCLE_STOREContext _context;
        private readonly ICategoryService _categoryService;

        public ContentViewComponent(CYCLE_STOREContext context,
            ICategoryService categoryService)
        {
            _context = context;
            _categoryService = categoryService;
        }

        public IViewComponentResult Invoke()
        {
            ContentLayoutModel content = new ContentLayoutModel();
            content.ProductCategories = _categoryService.GetCategoriesWithSubCategory();
            return View(content);
        }
    }
}

The website layout is driven by the _Layout.cshtml file. To render the header and portal the way you want, modify _Layout.cshtml and replace the existing code with the following:

<!DOCTYPE html>
<html lang="en">
<head>
    <meta name="viewport" content="width=device-width" />
    <title>Core Cycles Store</title>
    <link rel="apple-touch-icon" sizes="180x180" href="favicon/apple-touch-icon.png" />
    <link rel="icon" type="image/png" href="favicon/favicon-32x32.png" sizes="32x32" />
    <link rel="icon" type="image/png" href="favicon/favicon-16x16.png" sizes="16x16" />
    <link rel="manifest" href="favicon/manifest.json" />
    <link rel="mask-icon" href="favicon/safari-pinned-tab.svg" color="#503b75" />
    <link href="@Url.Content("~/css/StyleSheet.css")" rel="stylesheet" />
</head>
<body class='@ViewBag.BodyClass' id="body1">
    @await Component.InvokeAsync("Header")
    <div id="doc2" class="yui-t3 wrapper">
        <div id="bd">
            <div id="yui-main">
                <div class="content">
                    <div>
                        @RenderBody()
                    </div>
                </div>
            </div>
        </div>
    </div>
</body>
</html>

Upon completion, your directory should look like the following screenshot.
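To make the data flow above concrete without a database, here is a small self-contained C# sketch of what the Content component ultimately does with the service's result: walk each ProductCategory and list its ProductSubcategory names, as the Razor view renders them. The classes below are minimal in-memory stand-ins for the EF Core-scaffolded models, and BuildMenu is an illustrative helper, not part of the sample solution:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// In-memory stand-ins for the scaffolded entities; trimmed to what the view uses.
public class ProductSubcategory { public string Name { get; set; } }

public class ProductCategory
{
    public string Name { get; set; }
    public List<ProductSubcategory> ProductSubcategory { get; } = new List<ProductSubcategory>();
}

public static class MenuSketch
{
    // Flattens categories into the lines the Content view renders:
    // a header per category, then one indented line per subcategory.
    public static List<string> BuildMenu(IEnumerable<ProductCategory> categories)
    {
        return categories
            .SelectMany(c => new[] { c.Name }
                .Concat(c.ProductSubcategory.Select(s => "  " + s.Name)))
            .ToList();
    }

    public static void Main()
    {
        var bikes = new ProductCategory { Name = "Bikes" };
        bikes.ProductSubcategory.Add(new ProductSubcategory { Name = "Mountain Bikes" });

        foreach (var line in BuildMenu(new[] { bikes }))
            Console.WriteLine(line);
        // Prints:
        // Bikes
        //   Mountain Bikes
    }
}
```

In the real application, the categories come from CategoryService via the DbContext, and the view component hands them to Default.cshtml instead of printing them.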
Modifying the index file

In this final step, you modify the Home Index.cshtml file to host the content view component:

@{
    ViewBag.Title = "Core Cycles";
    Layout = "~/Views/Shared/_Layout.cshtml";
}
<div id="homepage" class="">
    <div class="content-mid">
        @await Component.InvokeAsync("Content")
    </div>
</div>

You can now build the solution. You should have an MVC .NET Core 3.1 application running with data from your database. The following screenshot shows the website view.

Securing the database user and password

The CloudFormation stack you launched also created a Secrets Manager entry to store the CYCLE_STORE database user ID and password. As an optional step, you can retrieve the database user ID and password from there instead of hard-coding them in the connection string, using the AWS Secrets Manager client-side caching library. The dependency package is available through NuGet; for this post, I use NuGet to add the library to the project.

1. On the NuGet Package Manager console, browse for AWSSDK.SecretsManager.Caching.
2. Choose the library and install it.
3. Follow the same steps to install Newtonsoft.Json.
4. Add a new class, ServicesConfiguration, to the solution and enter the following code. Make sure all the required references are added to the class.
Because GetSqlCredential is an extension method, the class is static:

public static class ServicesConfiguration
{
    public static async Task<Dictionary<string, string>> GetSqlCredential(this IServiceCollection services, string secretId)
    {
        var credential = new Dictionary<string, string>();

        using (var secretsManager = new AmazonSecretsManagerClient(Amazon.RegionEndpoint.USEast1))
        using (var cache = new SecretsManagerCache(secretsManager))
        {
            var sec = await cache.GetSecretString(secretId);
            var jo = Newtonsoft.Json.Linq.JObject.Parse(sec);
            credential["username"] = jo["username"].ToObject<string>();
            credential["password"] = jo["password"].ToObject<string>();
        }
        return credential;
    }
}

5. In appsettings.Development.json, replace the DefaultConnection value with the following:

"Server=sqlrdsdb.cph0bnedghnc.us-east-1.rds.amazonaws.com; Database=CYCLE_STORE;User Id=<UserId>;Password=<Password>;"

6. Add the following code to Startup.cs, which replaces the placeholder user ID and password with the values retrieved from Secrets Manager:

Dictionary<string, string> secrets = services.GetSqlCredential("CycleStoreCredentials").Result;
connectionString = connectionString.Replace("<UserId>", secrets["username"]);
connectionString = connectionString.Replace("<Password>", secrets["password"]);
services.AddDbContext<CYCLE_STOREContext>(options => options.UseSqlServer(connectionString));

7. Build the solution again.

Cleaning up

To avoid incurring future charges, on the AWS CloudFormation console, delete the SQLRDSEXStack stack.

Conclusion

This post showed how to modernize a legacy enterprise ASP.NET MVC web application using .NET Core and how to convert Entity Framework to Entity Framework Core. In Part 2 of this post, we take this one step further and show you how to host this application in Linux containers.

About the Authors

Saleha Haider is a Senior Partner Solution Architect with Amazon Web Services.
Pratip Bagchi is a Partner Solutions Architect with Amazon Web Services.

View the full article