Streaming Steam games from Amazon EC2 to Steam Link over OpenVPN tunnel featuring Pfsense and VMware

Oh, have I longed to write this blog post. Ever since I bought a Steam Link for myself as a Christmas gift I’ve been wanting to make use of it. I’m the kind of person who sometimes (a bit too often) buys stuff first and justifies the purchase later (sometimes with a bit too much infrastructure).

Anyways, this blog post was a starting point for me:
Revised and much faster, run your own high-end cloud gaming service on EC2!

Back in February I gave it a try but never got it to work: I wasn’t able to ping my local machines from my EC2 machine over my OpenVPN tunnel. This confused me a lot and I left it for a while. I tried again last week and got it to work. The magic was that since I’m running my Pfsense instance in VMware, I had to put the network card in promiscuous mode (yes, it’s really called that, and it basically means the virtual switch passes all packets to the card instead of only the ones addressed to it).

After the network card was in promiscuous mode everything just worked. I downloaded a couple of games, and when I started a Steam client on my local network it simply offered to stream from the Windows machine I had in EC2.

In the blog post above the connection is made from your local machine to EC2, but I’m doing it in the other direction, so I’ll explain that in more detail here. Also, since the pre-made EC2 Gaming AMI is a couple of years old, I had to update Windows, Steam and the Nvidia drivers, and I’ll go through that too.

EC2

These are the steps needed to get the machine up and running in EC2; refer back to the original blog post for details.

  1. Launch the ec2gaming machine in EC2 as a g2.2xlarge spot instance; this is documented in the blog post already. I created a Security Group with full access for my public IP address, but you can of course be more restrictive by only allowing RDP.
  2. Connect to the PC (this works even on a Mac with Microsoft Remote Desktop)
  3. Change the password on first login (you don’t have a choice)
  4. Run Windows update (this will download about 1 GB of updates as of July 2018)
  5. Download the Nvidia drivers (NVIDIA GRID K520/K340 RELEASE 320) from here and upgrade them.
  6. Uninstall OpenVPN (from the Start menu) and download a newer version from here. Don’t install the OpenVPN Service; it’s not needed.
  7. Now is the time to take a snapshot of the machine, since a spot instance is always terminated when you turn it off. You can do this manually from the AWS Console or with the gaming-down.sh script as described in the blog (see the sketch below for the gist of it). If you keep using the scripts, it’s a good idea to create an IAM user with limited access, since the credentials are stored in clear text in the script.
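
If you’re curious what that snapshot-and-terminate step boils down to, here’s a minimal boto3 sketch of the same idea. This is not the actual gaming-down.sh script, just a rough Python equivalent, and the instance ID is a placeholder:

import boto3

INSTANCE_ID = 'i-0123456789abcdef0'  # placeholder, use your own instance ID

ec2 = boto3.client('ec2')

# Create an AMI from the running instance so it can be recreated later
# (a spot instance is gone for good once it is terminated).
image = ec2.create_image(InstanceId=INSTANCE_ID, Name='ec2gaming')
print('Created image ' + image['ImageId'])

# Wait until the image is available before pulling the plug.
ec2.get_waiter('image_available').wait(ImageIds=[image['ImageId']])

# Terminate the spot instance now that the snapshot exists.
ec2.terminate_instances(InstanceIds=[INSTANCE_ID])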

I’ve created a pretty narrow policy for the IAM user that runs gaming-up.sh and gaming-down.sh:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "ec2:DeleteSnapshot",
            "Resource": "arn:aws:ec2:*::snapshot/*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "ec2:TerminateInstances",
            "Resource": "arn:aws:ec2:*:*:instance/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeImages",
                "ec2:DescribeSpotPriceHistory",
                "ec2:CancelSpotInstanceRequests",
                "ec2:DeregisterImage",
                "ec2:DescribeInstances",
                "ec2:RequestSpotInstances",
                "ec2:CreateImage",
                "ec2:DescribeSpotInstanceRequests"
            ],
            "Resource": "*"
        }
    ]
}
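
If you prefer to set the user up from code instead of clicking through the IAM console, here’s a boto3 sketch of creating the user and attaching the policy above. The user and policy names are just my own picks, and I assume the JSON above is saved to a local file:

import boto3

iam = boto3.client('iam')

# The JSON policy above, saved to a local file.
with open('ec2gaming-policy.json') as policy_file:
    policy_document = policy_file.read()

iam.create_user(UserName='ec2gaming')
iam.put_user_policy(UserName='ec2gaming',
                    PolicyName='ec2gaming-spot',
                    PolicyDocument=policy_document)

# These are the credentials to put in gaming-up.sh and gaming-down.sh.
key = iam.create_access_key(UserName='ec2gaming')['AccessKey']
print(key['AccessKeyId'] + ' ' + key['SecretAccessKey'])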

Pfsense

I’m using Pfsense at home instead of a normal router; it runs in VMware ESXi (5.1 at the moment, but an upgrade is coming) and works like a charm. I won’t go into details about Pfsense since I assume that if you’re reading this you are kind of a geek anyways. Follow the steps below to set up an OpenVPN server in Pfsense that your EC2 machine can connect to.

I used the information in this blog post to set up OpenVPN:
Create a stretched LAN between your site and vCloud using pfSense

  1. Create the OpenVPN server according to these settings (instead of using screenshots I printed my configuration page as a PDF). Most of it is standard, and it’s all described in the blog post about the stretched LAN.
  2. Go to Interfaces / Interface Assignments and assign the aws-lan-bridged Network port as OPT1 or whatever name you like
  3. The firewall will probably have created some rules for your OpenVPN server already, so you might not have to create the ones for the inbound traffic (WAN port 1194), but create the other rules as described in the blog post.
  4. Create the Bridge as described (it should consist of LAN and OPT1).

That’s what you need on the Pfsense side of things, but if you’re using VMware as a hypervisor like me, you will need to do one more thing, as I found here after some serious Googling about why I couldn’t reach my internal network from EC2.

Log in to your ESXi host and issue a command like this from the command line:

esxcli network vswitch standard policy security set --allow-promiscuous=true --vswitch-name=vSwitch0

This assumes your vSwitch is named vSwitch0; I only had one, so finding it wasn’t hard, but please refer to the VMware documentation. Your version might differ from mine since I’m on ESXi 5.1.
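
If you don’t want to type that on the host every time, the same command can be pushed over SSH. Here’s a rough sketch using paramiko, assuming SSH is enabled on the ESXi host; the host name and credentials are placeholders:

import paramiko

# Placeholders: your ESXi host and root credentials.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('esxi.example.com', username='root', password='secret')

# Run the same esxcli command as above on the host.
stdin, stdout, stderr = ssh.exec_command(
    'esxcli network vswitch standard policy security set '
    '--allow-promiscuous=true --vswitch-name=vSwitch0')
print(stdout.read())
ssh.close()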

Connecting

We have an EC2 machine and we have a Pfsense OpenVPN server. Now we need a client configuration for the Windows machine, and it looks like this:

dev tap
persist-key
cipher AES-128-CBC
auth SHA1

resolv-retry infinite
proto udp
remote YOUR-PFSENSE-HOSTNAME 1194
keepalive 10 60
ping-timer-rem
<secret>
#
# 2048 bit OpenVPN static key
#
-----BEGIN OpenVPN Static key V1-----
THIS BLOCK SHOULD BE COPIED FROM
Shared Key
IN THE 
Cryptographic Settings
SECTION OF THE OPENVPN SERVER
CONFIGURATION IN PFSENSE
-----END OpenVPN Static key V1-----
</secret>

Create a file called client.ovpn on your Windows server, then right-click on the OpenVPN GUI taskbar icon and choose Import file…

Right-click the OpenVPN GUI icon again and you should have a menu option client with Connect under it. Choose Connect and you should be connected to your LAN.
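
A quick way to sanity-check the tunnel from the Windows machine is to ping something on your home LAN. A tiny sketch, where the address is a placeholder for a host on your LAN:

import subprocess

# Placeholder address: pick something that actually exists on your LAN.
result = subprocess.call(['ping', '-n', '3', '192.168.1.1'])
print('Tunnel is up' if result == 0 else 'No response over the tunnel')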

Steam

We left the fun stuff for last: open Steam, log in with your credentials and make sure it’s configured for streaming; this is described in the first blog link. On your local network your other Steam client(s) should pick up that there’s a new device available for streaming.

Boot up your Steam Link and enjoy gaming!

Beware of shutting down the streaming server from the Steam Link: this will terminate the instance, since it’s a spot instance.

Talk to the fridge! (using Alexa, Salesforce and Electric Imp)

Long time no blog post, sorry. I have been meaning to write this post forever but have somehow managed to avoid it.

Anyways, consider the scenario where you’re sitting on your couch and wondering:
– “What’s the temperature in my fridge?”
– “Did I close the door?”
– “What’s the humidity?”

You have already installed your Electric Imp hardware in the fridge (Best Trailhead Badge Ever) and it’s speaking to Salesforce via platform events; you even get a case when the temperature or humidity reaches a threshold or the door is open for too long.

But what if you just want to know the temperature? And you don’t have time to log into Salesforce to find out.

Alexa Skills to the rescue!

Thanks to this awesome blog post:
https://andyinthecloud.com/2016/10/05/building-an-amazon-echo-skill-with-the-flow-api/

And this GitHub repository:
https://github.com/financialforcedev/alexa-salesforce-flow-skill

And example Flows from here:
https://github.com/financialforcedev/alexa-salesforce-flow-skill-examples

I’ll walk you through what’s needed to speak to your fridge.

I will only show the small pieces you need to set this up; for details, please read the original blog posts.

First of all you need an Alexa Skill; I have created one called Salesforce.

This is the interaction model:

{
  "intents": [
    {
      "intent": "FridgeStatus"
    }
  ]
}

And the Sample Utterances:

FridgeStatus How is my fridge

I won’t go into details about the Lambda function and the connected app that are needed; please refer to this documentation:
https://github.com/financialforcedev/alexa-salesforce-flow-skill/wiki/Setup-and-Configuration

The important thing here is the FridgeStatus intent in the Sample Utterances: you’ll need a Flow with the same name, FridgeStatus.
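
Under the hood the skill invokes the Flow through the Salesforce REST API. Here’s a hedged sketch of that call; the instance URL and access token are placeholders, and the GitHub project above does this plumbing for you:

import requests

# Placeholders: your instance URL and a valid OAuth access token.
INSTANCE_URL = 'https://yourinstance.salesforce.com'
ACCESS_TOKEN = 'OAUTH-ACCESS-TOKEN'

# Invoke the autolaunched Flow named FridgeStatus.
response = requests.post(
    INSTANCE_URL + '/services/data/v43.0/actions/custom/flow/FridgeStatus',
    headers={'Authorization': 'Bearer ' + ACCESS_TOKEN},
    json={'inputs': [{}]})
print(response.json())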

Here’s mine:

Going into details:

And creating the response:

The Value is:

Your fridge temperature is {!Temperature} degrees Celsius, the humidity is {!Humidity} percent, and the door is {!DoorStatus}

The result sounds like this:

So the next time you wonder about the temperature in the fridge you won’t have to move from the couch, awesome right?

The next step would be to ask Alexa “What’s the average temperature during the last day?” and calculate the average from the BigObjects holding my temperature readings.

Cheers,
Johan

Uploading CSV data to Einstein Analytics with AWS Lambda (Python)


I have been playing around with Einstein Analytics (the thing they used to call Wave) and I wanted to automate the upload of data, since there’s no point in having dashboards and lenses if the data is stale.

After using Lambda functions against the Bulk API I wanted something similar, and I found another nice project over at Heroku’s GitHub account called pyAnalyticsCloud.

I don’t have a Postgres database, so I ended up using only the uploader.py file and wrote this Lambda function to use it:

from __future__ import print_function

import json
from base64 import b64decode
import boto3
import uuid
import os
import logging
import unicodecsv
from uploader import AnalyticsCloudUploader

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3_client = boto3.client('s3')
username = os.environ['SF_USERNAME']
encrypted_password = os.environ['SF_PASSWORD']
encrypted_security_token = os.environ['SF_SECURITYTOKEN']
password = boto3.client('kms').decrypt(CiphertextBlob=b64decode(encrypted_password))['Plaintext'].decode('ascii')
security_token = boto3.client('kms').decrypt(CiphertextBlob=b64decode(encrypted_security_token))['Plaintext'].decode('ascii')
file_bucket = os.environ['FILE_BUCKET']
wsdl_file_key = os.environ['WSDL_FILE_KEY']
metadata_file_key = os.environ['METADATA_FILE_KEY']

def bulk_upload(csv_path, wsdl_file_path, metadata_file_path):
    with open(csv_path, mode='r') as csv_file:
        logger.info('Initiating Wave Data upload.')
        logger.debug('Loading metadata')
        with open(metadata_file_path, 'r') as metadata_file:
            metadata = json.load(metadata_file)

        logger.debug('Loading CSV data')
        data = unicodecsv.reader(csv_file)
        edgemart = metadata['objects'][0]['name']

        logger.debug('Creating uploader')
        uploader = AnalyticsCloudUploader(metadata, data)
        logger.debug('Logging in to Wave')
        uploader.login(wsdl_file_path, username, password, security_token)
        logger.debug('Uploading data')
        uploader.upload(edgemart)
        logger.info('Wave Data uploaded.')
        return 'OK'

def handler(event, context):
    for record in event['Records']:
        # Incoming CSV file
        bucket = record['s3']['bucket']['name']
        key = record['s3']['object']['key']
        csv_path = '/tmp/{}{}'.format(uuid.uuid4(), key)
        s3_client.download_file(bucket, key, csv_path)

        # WSDL file
        wsdl_file_path = '/tmp/{}{}'.format(uuid.uuid4(), wsdl_file_key)
        s3_client.download_file(file_bucket, wsdl_file_key, wsdl_file_path)

        # Metadata file
        metadata_file_path = '/tmp/{}{}'.format(uuid.uuid4(), metadata_file_key)
        s3_client.download_file(file_bucket, metadata_file_key, metadata_file_path)
        bulk_upload(csv_path, wsdl_file_path, metadata_file_path)

    # All records in the event have been processed.
    return 'OK'
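
To try the function without waiting for a real upload, you can feed the handler a minimal fake S3 event, assuming the environment variables and S3 files are in place; the bucket and key here are placeholders:

# Minimal fake S3 event for a local test run.
fake_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'my-csv-bucket'},
            'object': {'key': 'export.csv'},
        }
    }]
}
print(handler(fake_event, None))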

Yes, the logging is a bit on the extensive side. Make sure to add these environment variables in AWS Lambda:

SF_USERNAME - your SF username
SF_PASSWORD - your SF password (encrypted)
SF_SECURITYTOKEN - your SF security token (encrypted)
FILE_BUCKET - the bucket where the WSDL and metadata files are stored
METADATA_FILE_KEY - the path to the metadata file in that bucket (you get this file from Einstein Analytics)
WSDL_FILE_KEY - the path to the partner WSDL file in the bucket
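
One way to produce the encrypted values for SF_PASSWORD and SF_SECURITYTOKEN is with KMS directly; roughly like this, where the key alias is a placeholder:

from base64 import b64encode
import boto3

# Placeholder: a KMS key that the Lambda execution role can decrypt with.
kms = boto3.client('kms')
ciphertext = kms.encrypt(KeyId='alias/lambda-secrets',
                         Plaintext='my-sf-password')['CiphertextBlob']
print(b64encode(ciphertext))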

I added an S3 trigger that runs this function as soon as a new file is uploaded. It has some issues (crashing when the file name contains parentheses, for example), so please don’t use this for a production workload before making it enterprise grade.
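
The file name crashes are most likely because S3 URL-encodes object keys in event notifications. If that’s the cause, decoding the key before downloading should help; a sketch of a helper you could call in the handler:

import urllib

def s3_key(record):
    # S3 event notifications URL-encode the object key (spaces become '+').
    return urllib.unquote_plus(record['s3']['object']['key'].encode('utf8'))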

Note: The code above only works in Python 2.7

Cheers