AWS Transfer for SFTP Explained: A VPC Use Case

Organizations often find themselves needing to make secure file transfers to outside entities such as clients and vendors. Not only must these transfers maintain the security and integrity of internal infrastructure, but the process also needs to be practical and cost-effective. Secure Shell (SSH) File Transfer Protocol (SFTP) servers used to be the go-to answer for this enterprise requirement, but running them is costly and not necessarily efficient best practice. AWS launched its fully managed AWS Transfer for SFTP in answer to this dilemma.

Reduce Costly SFTP Overheads

Rather than invest the time and money needed to run an infrastructure setup of SFTP servers, organizations can let AWS Transfer for SFTP remove all such maintenance overheads. AWS SFTP provides access to specific S3 buckets and prefixes per user, so organizations can fully leverage SFTP to let external entities upload, download, and delete files in those buckets with ease.

For resource and performance efficiency in file transfers, AWS SFTP leverages elastic resources to auto-scale according to the transfer workload. It’s also possible to configure an SFTP endpoint and set up client access through web, CLI, and API interfaces. 

Integrate AWS IAM with AWS Transfer for SFTP

Furthermore, thanks to AWS's broad service integration across business-critical environments, AWS Transfer for SFTP supports common internal and external user-authentication systems. Authentication can be set up through custom development against the necessary API Gateway endpoints, or handled by the service-managed identity provider. Administrators can also set custom roles per user to lock down permissions to the S3 storage where the files are located.

Simply open the IAM dashboard and create a new IAM role, then establish a ‘trust relationship’ with the SFTP service. Next, create a new IAM policy that grants access to the S3 bucket used with SFTP, and attach the policy to the previously created role. We go into this in more detail later in the article.
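The trust relationship mentioned above is just a policy document. Here is a minimal sketch, assuming the standard AWS Transfer service principal, built with Python’s standard library so it can be inspected or saved before attaching it to the role:

```python
import json

# Minimal sketch of the trust policy that lets the AWS Transfer service
# assume the IAM role. "transfer.amazonaws.com" is the service principal
# that AWS Transfer uses when acting on a user's behalf.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "transfer.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

Paste the resulting JSON into the ‘Trust relationships’ tab of the role in the IAM console.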

Finally, it’s possible to set up a secure SFTP server within an organization’s VPC, from creating a VPC endpoint right through to configuring an external tool (FileZilla in this example) for your users to connect with. Simply follow these guidelines, or leverage your MSP, such as Ibexlabs, to implement this for you:

Step #1: Create an AWS Transfer for SFTP VPC Endpoint 

  • Start inside your VPC Dashboard.
  • Click on ‘Endpoints’ before clicking ‘Create Endpoint’.
  • Select ‘AWS Services’ in the ‘Service category’. 
  • Next, choose ‘com.amazonaws.region.transfer.server’ (substituting your region), before entering your organization’s information into the ‘VPC’ field and noting its ‘Availability Zones’ and ‘Subnet IDs’.
  • Confirm that ‘Enable Private DNS Name’ is selected. For ‘Security Group’, choose the security group that you want to apply to the endpoint; by all means, accept the default security group if it meets your needs.
  • Finally, click ‘Create Endpoint’ at the bottom of the page; the endpoint starts in a pending state. When it becomes available, jot down the ID of the VPC endpoint that you just created, as you will need it later.
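For teams who prefer to script this step, the same endpoint can be created with the AWS SDK. The sketch below only assembles the request parameters using the standard library; the region, VPC, subnet, and security-group IDs are placeholders, and the commented call shows where boto3 (if installed) would send the request:

```python
import json

# Placeholders -- substitute your own region and resource IDs.
region = "us-east-1"
params = {
    "VpcEndpointType": "Interface",
    # Matches the service name chosen in the console above.
    "ServiceName": f"com.amazonaws.{region}.transfer.server",
    "VpcId": "vpc-0123456789abcdef0",
    "SubnetIds": ["subnet-0123456789abcdef0"],
    "SecurityGroupIds": ["sg-0123456789abcdef0"],
    "PrivateDnsEnabled": True,
}

# With boto3 available, the actual call would be:
#   import boto3
#   boto3.client("ec2", region_name=region).create_vpc_endpoint(**params)
print(json.dumps(params, indent=2))
```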

Step #2: Creating an SFTP server with a VPC Endpoint 

  • Open the AWS SFTP console after signing into your AWS Management Console.
  • Click ‘Create server’ and select the VPC Endpoint type from the list of endpoint types available.
  • In the ‘Identity provider’ section of the form, opt for the ‘Service managed’ option to store user identities and keys in AWS Transfer for SFTP. 
  • Optional Extras: 
    • To integrate a ‘Logging role’, specify an IAM role that will link Amazon CloudWatch logging to your SFTP user activity. 
    • For ‘Key’ and ‘Value’, input one or more tags as key-value pairs. Select ‘Add tag’ to set up additional tags for your server.  
  • Finally, click on ‘Create’ to launch your server. You will move to the ‘Servers’ page, where your brand new server is configured with a VPC endpoint type. 
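The console steps above map onto a single AWS Transfer API call. This is a hedged sketch of the request parameters; the VPC endpoint ID, logging-role ARN, and tag values are placeholders, and the logging role is optional:

```python
import json

# Placeholders -- substitute the VPC endpoint ID noted in Step 1 and
# your own logging-role ARN (the logging role is optional).
params = {
    "EndpointType": "VPC_ENDPOINT",
    "EndpointDetails": {"VpcEndpointId": "vpce-0123456789abcdef0"},
    "IdentityProviderType": "SERVICE_MANAGED",
    "LoggingRole": "arn:aws:iam::123456789012:role/sftp-logging-role",
    "Tags": [{"Key": "Name", "Value": "sftp-vpc-server"}],
}

# With boto3 available, the actual call would be:
#   import boto3
#   boto3.client("transfer").create_server(**params)
print(json.dumps(params, indent=2))
```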

Step #3: Create a Network Load Balancer to Point at SFTP 

  • Open your Amazon EC2 console.
  • From the ‘Load Balancing’ tab in the navigation pane, choose ‘Create’ under ‘Network Load Balancer’.
  • Type a name for your load balancer and select either the internet-facing or internal ‘Scheme’ option. An internet-facing load balancer directs client requests across the internet to targets. An internal load balancer routes such requests through private IP addresses to targets.
  • Either keep the default listener settings of TCP traffic on port 22, or modify the ‘Load Balancer Protocol’ and ‘Port’ of the listener, or select ‘Add listener’ to customize the field. 
  • Choose the VPC Availability Zones in use for your Amazon EC2 instances, then select a public subnet for each.
  • Click ‘Next: Configure Routing’. 
  • Keep the default, ‘New target group’ and enter the appropriate ‘Name’, before setting ‘Protocol’ and ‘Port’ as required. 
  • For ‘Target type’, decide between registering your targets to ‘IP’ or ‘Instance’. 
  • Keep the default health check settings in ‘Health checks’.
  • Choose ‘Next: Register Targets’. 
  • Enter the network-interface IP address of the VPC endpoint as the Network Load Balancer target.
  • The IP addresses of the VPC endpoint can be found in the ‘Subnets’ section of the VPC endpoint’s details.
  • Add the IP address from each subnet, targeting port 22, the port SFTP runs on.
  • Then click ‘Next: Register Targets’.
  • Now it’s possible to access the SFTP server via the DNS name of the load balancer. Just ensure the Network Load Balancer has a listener on port 22.
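The target-registration step above can also be sketched programmatically. The endpoint IP addresses and target-group ARN below are placeholders; the list comprehension simply maps each VPC endpoint network-interface IP (one per subnet) onto an IP target on port 22:

```python
# Sketch: register the VPC endpoint's network-interface IPs (one per
# subnet, found in the endpoint's 'Subnets' section) as IP targets on
# port 22. The addresses and ARN are placeholders.
endpoint_ips = ["10.0.1.25", "10.0.2.40"]
targets = [{"Id": ip, "Port": 22} for ip in endpoint_ips]

target_group_arn = (
    "arn:aws:elasticloadbalancing:us-east-1:123456789012:"
    "targetgroup/sftp-targets/0123456789abcdef"
)
# With boto3 available, the actual call would be:
#   import boto3
#   boto3.client("elbv2").register_targets(
#       TargetGroupArn=target_group_arn, Targets=targets)
print(targets)
```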

Step #4: Add a User Manually

  • Log in to your AWS account.
  • On the ‘Servers’ page, add a user to an SFTP server by selecting the checkbox next to the appropriate one.
  • Choose ‘Add user’ to move to the next screen.
  • Input the user name and select the IAM role that you previously created to provide access, or create a new IAM role for the specific user. Next, select the policy below ‘Home directory’. We created the policy below and labeled it ‘sftp-tests’.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowListingOfUserFolder",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::bucket_name"
            ]
        },
        {
            "Sid": "HomeDirObjectAccess",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObjectVersion",
                "s3:DeleteObject",
                "s3:GetObjectVersion"
            ],
            "Resource": "arn:aws:s3:::bucket_name/*"
        }
    ]
}
Note: if this role has already been created by a CloudFormation template (CFT), it can be reused. A scope-down policy that restricts each user to their own home folder looks like this:
{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Sid": "AllowListingOfUserFolder",
          "Action": [
              "s3:ListBucket"
          ],
          "Effect": "Allow",
          "Resource": [
              "arn:aws:s3:::${transfer:HomeBucket}"
          ],
          "Condition": {
              "StringLike": {
                  "s3:prefix": [
                      "${transfer:HomeFolder}/*",
                      "${transfer:HomeFolder}"
                  ]
              }
          }
      },
      {
          "Sid": "HomeDirObjectAccess",
          "Effect": "Allow",
          "Action": [
              "s3:PutObject",
              "s3:GetObject",
              "s3:DeleteObjectVersion",
              "s3:DeleteObject",
              "s3:GetObjectVersion",
              "s3:GetObjectACL",
              "s3:PutObjectACL"
          ],
          "Resource": "arn:aws:s3:::${transfer:HomeDirectory}*"
       }
  ]
}
  • Configure the S3 bucket which you want to transfer to using AWS SFTP. Input the path to the ‘Home directory’ where your user lands when they log in using their SFTP client.
  • Input the SSH public key data of the SSH key pair (the contents of id_rsa.pub), and share the private key with the user.
Generate the key pair with the ssh-keygen command, entering the path where the keys should be saved when prompted:

ssh-keygen
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa): /opt/BounceX
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /opt/BounceX.
Your public key has been saved in /opt/BounceX.pub.
The key fingerprint is:
SHA256:bBGvsTC8AJl3FcvQtGZtfQhtweEiAYloUTabDVKTBCc
The key's randomart image is:
+---[RSA 2048]----+
|  EXXoo**..ooo   |
|  =*+Ooo.B ++.   |
| . .+.= X =.+ .  |
|     . B B . .   |
|      . S        |
|       .         |
|                 |
|                 |
|                 |
+----[SHA256]-----+
  • Finally, select ‘Add’ to complete the process and add your new user to your SFTP server of choice.
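As with the earlier steps, adding the user can be expressed as a single AWS Transfer API request. This is a sketch only; the server ID, role ARN, bucket path, user name, and public-key string are all placeholders to be replaced with your own values:

```python
import json

# Placeholders -- substitute your server ID, role ARN, bucket path,
# and the contents of the user's id_rsa.pub file.
params = {
    "ServerId": "s-0123456789abcdef0",
    "UserName": "sftp-user",
    "Role": "arn:aws:iam::123456789012:role/sftp-access-role",
    "HomeDirectory": "/bucket_name/home/sftp-user",
    "SshPublicKeyBody": "ssh-rsa AAAA<public-key-material> user@host",
}

# With boto3 available, the actual call would be:
#   import boto3
#   boto3.client("transfer").create_user(**params)
print(json.dumps(params, indent=2))
```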

Step #5: External Login to the SFTP Server

  • Login to the SFTP Server using the command line interface:

sftp -i <private-key-file> <username>@<server-endpoint>

  • Login to the SFTP Server using FileZilla:
  • Open FileZilla and click on the ‘Open the site manager’ button.
  • Enter the following required details:
  1. Click on ‘New Site’
  2. Give the new site a name
  3. Go to the ‘General’ tab and enter the ‘Host’ name
  4. Enter the ‘Port’ number
  5. Select the ‘Protocol’ type: ‘SFTP – SSH File Transfer Protocol’ 
  6. Choose ‘Key file’ in ‘Logon Type’ 
  7. Enter the ‘User’ name 
  8. And ‘Browse’ to find the ‘Key file’
  9. Click on the ‘Connect’ button to finish and connect to the SFTP server

And there you have it: from start to finish, connecting AWS Transfer for SFTP to an SFTP server through a VPC endpoint, and logging in with an external tool such as FileZilla.


Ibexlabs is an experienced DevOps & Managed Services provider and an AWS consulting partner. Our AWS Certified DevOps consultancy team evaluates your infrastructure and makes recommendations based on your individual business or personal requirements. Contact us today and set up a free consultation to discuss a custom-built solution tailored just for you.

Rajasekhar Mandava
