
aws-samples / aws-cfn-windows-hpc-template

This sample CloudFormation template will launch a Windows-based HPC cluster running Windows Server 2012 R2 and supporting core infrastructure including VPC, domain controllers and bastion servers.

License: MIT No Attribution


aws-cfn-windows-hpc-template's Introduction

HPC Pack 2019

This sample, non-production-ready AWS CloudFormation template will launch a Windows-based HPC cluster running Windows Server 2016 and supporting core infrastructure including Amazon VPC and Domain Controllers. © 2021 Amazon Web Services, Inc. or its affiliates. All Rights Reserved. This AWS Content is provided subject to the terms of the AWS Customer Agreement available at http://aws.amazon.com/agreement or other written agreement between Customer and either Amazon Web Services, Inc. or Amazon Web Services EMEA SARL or both.

Why should I use this solution?

You should use this solution if you want to install HPC Pack to manage a cluster of Windows Server instances for High Performance Computing.

Architecture

(Architecture diagram)

Deployment

Deployment Artifacts

Create an S3 bucket to store deployment artifacts. Note the bucket name and region as these will be used when deploying CloudFormation templates. A bucket named hpcpack-2019-[YOUR AWS ACCOUNT ID] should ensure the bucket name is globally unique. You can leave all other S3 settings as default.

Next, create an S3 bucket to store the HPC output artifacts. Note the bucket name and region as these will be used when deploying CloudFormation templates. A bucket named hpcpack-2019-outputs-[YOUR AWS ACCOUNT ID] should ensure the bucket name is globally unique. You can leave all other S3 settings as default.
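If you prefer the CLI, both buckets can be created with aws s3 mb (a sketch, assuming the naming convention above and your default region):

    aws s3 mb s3://hpcpack-2019-[YOUR AWS ACCOUNT ID]
    aws s3 mb s3://hpcpack-2019-outputs-[YOUR AWS ACCOUNT ID]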

Next, navigate to the ../hpcpack-2019 directory.

Next, upload the ScriptsForComputeNode2019.zip and ScriptsForHeadNode2019.zip files to your S3 bucket.

Download HPC Pack from Microsoft's website (https://www.microsoft.com/en-us/download/confirmation.aspx?id=101360), rename the file to HPCPack.zip, and upload it to your S3 bucket.

  • You can upload via either the AWS Console or the AWS CLI. If you have the AWS CLI set up, uploading via the CLI is typically faster. For example: aws s3 cp HPCPack.zip s3://hpcpack-2019-[YOUR AWS ACCOUNT ID]
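The same pattern covers all three artifacts (assuming the artifact bucket name from the step above):

    aws s3 cp ScriptsForComputeNode2019.zip s3://hpcpack-2019-[YOUR AWS ACCOUNT ID]
    aws s3 cp ScriptsForHeadNode2019.zip s3://hpcpack-2019-[YOUR AWS ACCOUNT ID]
    aws s3 cp HPCPack.zip s3://hpcpack-2019-[YOUR AWS ACCOUNT ID]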

Deploy CloudFormation Templates

Navigate to the CloudFormation console (https://console.aws.amazon.com/cloudformation/home), choose "Create stack", and select the "With new resources (standard)" option. Then click "Upload a template file", navigate to hpcpack-2019/HPCLab/CloudFormation/, choose HPCLab2019.yml, and click Next. Enter a stack name (preferably something easy to remember, like HPC-Pack-Lab), then choose parameters appropriate for your installation.

For computeNodeInstanceCount, we recommend specifying 2 compute nodes for demonstration purposes.

For keyName, use an EC2 key pair you own. If you don't have one, create a new EC2 key pair in the EC2 console (https://console.aws.amazon.com/ec2/v2/home?region=us-east-1#KeyPairs:).
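You can also create a key pair from the CLI (a sketch; the key name my-hpc-key is just an example):

    aws ec2 create-key-pair --key-name my-hpc-key --query "KeyMaterial" --output text > my-hpc-key.pem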

Click through the prompts, and deploy the solution. It should take about 30 minutes to fully deploy.
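The deployment can also be sketched from the CLI. Parameter names follow the template as described above; the template may require additional parameters beyond these two, and you may need the IAM capability flag if the template creates IAM resources:

    aws cloudformation create-stack --stack-name HPC-Pack-Lab --template-body file://hpcpack-2019/HPCLab/CloudFormation/HPCLab2019.yml --parameters ParameterKey=keyName,ParameterValue=my-hpc-key ParameterKey=computeNodeInstanceCount,ParameterValue=2 --capabilities CAPABILITY_IAM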

Running a Sample Job

  1. Log in to the head node by opening your preferred RDP client (e.g., Windows Remote Desktop or Royal TSX). Locate the public IP address of the head node in the EC2 console (you can also find it in the deployed CloudFormation stack's Outputs section under publicIP; see the CLI snippet after this list). The login is HPCLAB\admin; the password is the one you entered in the CloudFormation parameters at deployment.

  2. Within the RDP client, once you're logged into the head node, open HPC Cluster Manager; this opens the HPC Pack 2019 management console. You should see the head node and compute nodes under Resource Management. You can also RDP into the compute nodes if you'd like.

  3. Within HPC Cluster Manager, under Job Management, click Add job from XML.

  4. Add C:\cfn\install\Parametric-Job-Prod.xml

  5. Leave all other settings as default and click Submit. You may need to enter the password you used for the CloudFormation deployment.

  6. Let the job run; when it's finished, navigate to the S3 output bucket you specified in the CloudFormation deployment parameters (it should be named hpcpack-2019-outputs-[YOUR AWS ACCOUNT ID]). Your output files should be there. Feel free to download eplustbl.htm locally to view the results in a web browser (see the CLI snippet after this list). You're finished with processing! You can also continue to analyze the data in S3 using Glue, Athena, and SageMaker for analytics/machine learning.

  7. Moving forward, you can also right-click the finished job -> View Job to view the tasks performed on the compute nodes. You can also copy and edit the job, adjusting HPC job settings such as core/node allocation, to create additional jobs. Great job!
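For steps 1 and 6, the head node's public IP and the job outputs can also be fetched from the CLI (a sketch, assuming the stack name HPC-Pack-Lab and the publicIP output key described above):

    aws cloudformation describe-stacks --stack-name HPC-Pack-Lab --query "Stacks[0].Outputs[?OutputKey=='publicIP'].OutputValue" --output text
    aws s3 cp s3://hpcpack-2019-outputs-[YOUR AWS ACCOUNT ID]/eplustbl.htm .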

FAQ

Disclaimer

This package depends on and may incorporate or retrieve a number of third-party software packages (such as open source packages) at install-time or build-time or run-time ("External Dependencies"). The External Dependencies are subject to license terms that you must accept in order to use this package. If you do not accept all of the applicable license terms, you should not use this package. We recommend that you consult your company’s open source approval policy before proceeding.

Provided below is a list of External Dependencies and the applicable license identification as indicated by the documentation associated with the External Dependencies as of Amazon's most recent review.

THIS INFORMATION IS PROVIDED FOR CONVENIENCE ONLY. AMAZON DOES NOT PROMISE THAT THE LIST OR THE APPLICABLE TERMS AND CONDITIONS ARE COMPLETE, ACCURATE, OR UP-TO-DATE, AND AMAZON WILL HAVE NO LIABILITY FOR ANY INACCURACIES. YOU SHOULD CONSULT THE DOWNLOAD SITES FOR THE EXTERNAL DEPENDENCIES FOR THE MOST COMPLETE AND UP-TO-DATE LICENSING INFORMATION.

YOUR USE OF THE EXTERNAL DEPENDENCIES IS AT YOUR SOLE RISK. IN NO EVENT WILL AMAZON BE LIABLE FOR ANY DAMAGES, INCLUDING WITHOUT LIMITATION ANY DIRECT, INDIRECT, CONSEQUENTIAL, SPECIAL, INCIDENTAL, OR PUNITIVE DAMAGES (INCLUDING FOR ANY LOSS OF GOODWILL, BUSINESS INTERRUPTION, LOST PROFITS OR DATA, OR COMPUTER FAILURE OR MALFUNCTION) ARISING FROM OR RELATING TO THE EXTERNAL DEPENDENCIES, HOWEVER CAUSED AND REGARDLESS OF THE THEORY OF LIABILITY, EVEN IF AMAZON HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. THESE LIMITATIONS AND DISCLAIMERS APPLY EXCEPT TO THE EXTENT PROHIBITED BY APPLICABLE LAW.

EnergyPlus version 9.5.0 - https://energyplus.net/licensing

aws-cfn-windows-hpc-template's People

Contributors

derekngu, hyandell, julienlepine, nhira, qb411


aws-cfn-windows-hpc-template's Issues

Issue with Node.js runtime

We have found another issue with the script, caused by the Node.js runtime specified in the Lambda function definition:

"FindSnapshotFunction": {
			"Type" : "AWS::Lambda::Function",
			"Properties" : {
				"Code" : {
					"S3Bucket" : "<BUCKETNAME>",
					"S3Key" : "lambda/find-snapshot.zip"
				},
				"Description" : "AWS Lambda function for searching snapshot based on name",
				"Handler" : "find-snapshot.handler",
				"MemorySize" : "128",
				"Role" : { "Fn::GetAtt" : [ "FindSnapshotRole", "Arn" ] },
				"Runtime" : "**nodejs**",
				"Timeout" : "25"
}

The Runtime value must now be changed to "nodejs4.3"; otherwise the function will fail to run properly within Lambda.
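The corrected line reads:

    "Runtime" : "nodejs4.3",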

Issue with SnapshotID parameter in 2-cluster.json

{
    "AWSTemplateFormatVersion" : "2010-09-09",
    "Description" : "This CloudFormation stack creates a Microsoft HPC Pack 2012 R2 Cluster in an existing environment. WARNING This template creates two or more Amazon EC2 instances. You will be billed for the AWS resources used if you create a stack from this template.",
    "Parameters" : {
        "KeyName" : {
            "Description": "Name of an existing EC2 Key Pair",
            "Type" : "String",
            "Default" : ""
        },
        "SnapshotID" : {
            "Description" : "Snapshot Id of the HPC Installation data",
            "Type" : "String",
            "AllowedPattern" : "snap-([0-9a-fA-F]{8})",
            "ConstraintDescription": "must be a valid snapshot ID."
        },

Under SnapshotID, the "AllowedPattern" constraint failed for me. Granted, I had to copy my snapshot between regions, and I noticed the replicated snapshot ID was longer than the original. So I changed line 13 of the JSON file to this:

"AllowedPattern" : "snap-([0-9a-zA-Z]*)"

So the final version looks like this:

{
    "AWSTemplateFormatVersion" : "2010-09-09",
    "Description" : "This CloudFormation stack creates a Microsoft HPC Pack 2012 R2 Cluster in an existing environment. WARNING This template creates two or more Amazon EC2 instances. You will be billed for the AWS resources used if you create a stack from this template.",
    "Parameters" : {
        "KeyName" : {
            "Description": "Name of an existing EC2 Key Pair",
            "Type" : "String",
            "Default" : ""
        },
        "SnapshotID" : {
            "Description" : "Snapshot Id of the HPC Installation data",
            "Type" : "String",
            "AllowedPattern" : "snap-([0-9a-zA-Z]*)",
            "ConstraintDescription": "must be a valid snapshot ID."
        },
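Note that newer EC2 snapshot IDs use a 17-character lowercase hex suffix rather than the original 8, so a tighter pattern that still accepts both generations would be something like:

    "AllowedPattern" : "snap-([0-9a-f]{8}|[0-9a-f]{17})",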

Issue with running this on base AWS Windows 2016 instance

Found a problem with the zip routine in the publish.ps1 file.
The publish.ps1 file called for the following assembly:
Add-Type -assembly "System.IO.Compression"
This does not work, and the zip function that follows fails because it cannot find the type:
Unable to find type [System.IO.Compression.ZipFile].

At C:\Users\Administrator\Downloads\aws-cfn-windows-hpc-template\publish.ps1:64 char:14
+   $archive = [System.IO.Compression.ZipFile]::Open($TmpName, [IO.Comp ...
+              ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (System.IO.Compression.ZipFile:TypeName) [], RuntimeException
    + FullyQualifiedErrorId : TypeNotFound

Found the following link:
https://blogs.technet.microsoft.com/heyscriptingguy/2015/03/09/use-powershell-to-create-zip-archive-of-folder/

This shows the following command:
Add-Type -assembly "system.io.compression.filesystem"

I've added the following to my copy, and this seems to alleviate the problem:

Add-Type -assembly "System.IO.Compression"
Add-Type -assembly "System.IO.Compression.filesystem"
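For context, a minimal PowerShell sketch of the pattern publish.ps1 uses once both assemblies are loaded (the file names here are illustrative, not from the repo):

    # ZipFile is defined in System.IO.Compression.FileSystem, so both Add-Type calls are needed
    Add-Type -assembly "System.IO.Compression"
    Add-Type -assembly "System.IO.Compression.FileSystem"
    # Create a new archive and add a file to it
    $archive = [System.IO.Compression.ZipFile]::Open("example.zip", [IO.Compression.ZipArchiveMode]::Create)
    [IO.Compression.ZipFileExtensions]::CreateEntryFromFile($archive, "example.txt", "example.txt") | Out-Null
    $archive.Dispose()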
