Rails/Devise authenticating using AWS/Cognito Identity


For a while now, I’ve been developing a sort of IoT controller with Rails 4. Until now, Devise was used to authenticate users locally using Devise’s provided :database_authenticatable module.

Things changed recently, and I had to move some features of this IoT controller toward AWS. Authentication against AWS/Cognito Identity is one part of the project.
“But why?” you might wonder. Because we need to extend authentication to other products, using a common user database. Cognito fits the description and helps speed up development on the AWS ecosystem. In other words, we want to be able to use our IoT controller alongside other new products without depending exclusively on the authentication part of the IoT controller.
This means that the local user database is still used for application-related data like permissions, sessions, extended profile attributes or database relationships, but pure authentication happens exclusively on Cognito. I moved to AWS just the first A of AAA. This is a disputable choice; let’s call it teamwork.

So, the purpose of this post is to synthesize the procedure required in order to:

  • authenticate against Cognito
  • handle password manipulation and account validation
  • migrate user data with a Lambda triggered by Cognito
  • create new local users if they don’t already exist

The following example should work well with recent Rails versions.


Obviously Rails and Devise are required, in addition to the following gem:
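A minimal sketch of the Gemfile addition, assuming the modular AWS SDK gem for Cognito:

```ruby
# Gemfile — AWS SDK client for Cognito Identity Provider
gem 'aws-sdk-cognitoidentityprovider'
```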

Rails 4/Devise adaptation

Environment variables will carry all the AWS-specific access information to Rails. Depending on the setup, a /.env file probably already exists, to which we can add the following two Cognito pool settings (TL;DR: create a new App client without an App client secret):
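A sketch of the .env entries; the variable names here are assumptions, pick whatever convention your app uses:

```shell
# .env — values come from the AWS Cognito console (assumed variable names)
AWS_COGNITO_POOL_ID=eu-west-1_XXXXXXXXX
AWS_COGNITO_CLIENT_ID=xxxxxxxxxxxxxxxxxxxxxxxxxx
```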

where the variable values match the AWS setup.

We’ll need a new Devise strategy; a safe location for this file is /config/initializers/cognito_authenticable.rb :
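A minimal sketch of such a strategy, assuming the USER_PASSWORD_AUTH flow is enabled on the App client and the env variable names from above (AWS_COGNITO_CLIENT_ID, AWS_REGION); error handling is reduced to the essentials:

```ruby
# /config/initializers/cognito_authenticable.rb
# Sketch of a Warden/Devise strategy that delegates authentication to Cognito.
require 'aws-sdk-cognitoidentityprovider'

module Devise
  module Strategies
    class CognitoAuthenticatable < Authenticatable
      def authenticate!
        client = Aws::CognitoIdentityProvider::Client.new(region: ENV['AWS_REGION'])
        client.initiate_auth(
          client_id: ENV['AWS_COGNITO_CLIENT_ID'],
          auth_flow: 'USER_PASSWORD_AUTH',
          auth_parameters: {
            'USERNAME' => authentication_hash[:email],
            'PASSWORD' => password
          }
        )
        # Cognito accepted the credentials: find or create the local user.
        user = User.find_or_create_by(email: authentication_hash[:email])
        success!(user)
      rescue Aws::CognitoIdentityProvider::Errors::NotAuthorizedException,
             Aws::CognitoIdentityProvider::Errors::UserNotFoundException
        fail!(:invalid)
      end
    end
  end
end

Warden::Strategies.add(:cognito_authenticatable,
                       Devise::Strategies::CognitoAuthenticatable)
```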

This strategy creates a new local user when Cognito authenticates a user that doesn’t yet exist locally.

Then we can configure Devise to use the strategy by giving it the class name in the file /config/initializers/devise.rb :
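Something along these lines, assuming the strategy name used in the sketch above:

```ruby
# /config/initializers/devise.rb (excerpt)
Devise.setup do |config|
  # Try the Cognito strategy before the default database one
  config.warden do |manager|
    manager.default_strategies(scope: :user).unshift :cognito_authenticatable
  end
end
```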

Now Devise should authenticate against the Cognito user database.

The next step would be to allow password reset using the password recovery system of AWS Cognito.
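The two Cognito API calls involved are forgot_password and confirm_forgot_password; a rough sketch of how they could be wired into a Rails action (the surrounding flow and parameter names are assumptions):

```ruby
# Sketch: delegating password recovery to Cognito
client = Aws::CognitoIdentityProvider::Client.new(region: ENV['AWS_REGION'])

# 1. Ask Cognito to send the recovery e-mail/SMS with a confirmation code
client.forgot_password(
  client_id: ENV['AWS_COGNITO_CLIENT_ID'],
  username: params[:email]
)

# 2. Later, set the new password using the code the user received
client.confirm_forgot_password(
  client_id: ENV['AWS_COGNITO_CLIENT_ID'],
  username: params[:email],
  confirmation_code: params[:code],
  password: params[:new_password]
)
```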

In this case, the sign-up procedure is still handled by Rails on the local database (see below), therefore account confirmation (with Devise’s :confirmable module) is handled by Devise without any changes.

Migrating users from Rails to Cognito

We wanted to migrate a user from the Rails database to Cognito if the user doesn’t already exist in the Cognito database. For that, at least one new endpoint in config/routes.rb is required:
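For instance (the path and action name are assumptions):

```ruby
# config/routes.rb — endpoint called by the Cognito migration Lambda
post 'aws_auth' => 'users#aws_auth'
```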

For this route, it is assumed that controllers/users_controller.rb has an aws_auth method:
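A minimal sketch: it verifies the credentials with Devise’s valid_password? and returns the attributes the migration Lambda needs. In a real setup this endpoint should also be protected, e.g. with a shared secret (the token check below is an assumption):

```ruby
# controllers/users_controller.rb (excerpt)
def aws_auth
  # Hypothetical shared secret so only our Lambda can call this endpoint
  return head :forbidden unless params[:token] == ENV['AWS_AUTH_TOKEN']

  user = User.find_by(email: params[:email])
  if user&.valid_password?(params[:password])
    render json: { email: user.email }, status: :ok
  else
    head :unauthorized
  end
end
```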

That’s all for the Rails side. Now, in the Cognito pool section of the AWS console, there is a Triggers panel where Lambdas can be attached. The User Migration trigger can be set to the following JavaScript Lambda:

This script uses the following two environment variables, which should be set with the proper information:
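Set in the Lambda’s configuration, for example (the names match the sketch above and are assumptions):

```shell
RAILS_ENDPOINT=https://example.com/aws_auth   # URL of the Rails aws_auth route
RAILS_AUTH_TOKEN=some-shared-secret           # must match the Rails side
```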



Raspberry Pi 3 and bluetooth speaker setup [command line]

Streaming sound from a Raspberry Pi 3 to a Bluetooth speaker was not as easy as it should be. The internet is overloaded with outdated information about this issue, so I compiled a small tutorial for the latest Raspbian (at the time of writing). My setup is composed of the following elements:

Trying to connect the device to the Raspberry Pi with the following commands will raise an error: Failed to connect: org.bluez.Error.Failed
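The failing attempt looks roughly like this in bluetoothctl (the MAC address is a placeholder for your speaker’s address):

```shell
$ bluetoothctl
[bluetooth]# scan on
[bluetooth]# pair XX:XX:XX:XX:XX:XX
[bluetooth]# connect XX:XX:XX:XX:XX:XX
Failed to connect: org.bluez.Error.Failed
```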

It is not obvious, but what the connecting device is missing is an answer from the host saying that it can stream sound. In order to send the correct answer to the device, pulseaudio should be started first and should correctly load the Bluetooth module.

First, ensure that all requirements are installed:
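On a recent Raspbian, the packages involved are roughly these (names may vary slightly between releases):

```shell
sudo apt-get install pulseaudio pulseaudio-module-bluetooth mpg123
```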

Then start pulseaudio and check that the module is indeed loaded:
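For example:

```shell
pulseaudio --start
pactl list short modules | grep bluetooth
# module-bluetooth-discover should appear in the output
```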

At this point, if you try to connect your device, it should work. Most devices produce a sound when the connection succeeds. Within bluetoothctl, it is not required to redo the pairing procedure, but before continuing you may need to ensure that the Bluetooth controller is on with [bluetooth]# power on .

Load a MP3 on your Raspberry Pi and play it using mpg123 your.mp3 .

If you restart your Raspberry Pi now, your device will fail to connect again.
pulseaudio is not really meant to start system-wide at boot, but in the case of an embedded device it makes sense.

You could use /etc/rc.local  but it’s a bit punk, so instead let’s create a systemd service.
Add the following content to this file /lib/systemd/system/pulseaudio.service :
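A minimal unit file could look like this; the options are a reasonable sketch for a system-wide daemon, adapt to taste:

```ini
[Unit]
Description=PulseAudio system-wide daemon
After=bluetooth.service

[Service]
Type=simple
ExecStart=/usr/bin/pulseaudio --system --disallow-exit --disallow-module-loading

[Install]
WantedBy=multi-user.target
```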

And start the service by hand using systemctl start pulseaudio . When running systemctl status pulseaudio , you may find a warning about starting pulseaudio system-wide, but don’t worry, with this setup it makes sense.
Add pulseaudio to the list of services to start at boot:  systemctl enable pulseaudio.service

Don’t forget to add both the root and pi users to the group pulse-access in /etc/group :
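The pulse-access line should end up looking like this:

```
pulse-access:x:115:root,pi
```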

Note that the group ID 115 may be different on your system.

Reboot your Raspberry Pi and you are good to go: if the speaker is ON, it should automatically connect to your system.

Let’s encrypt (il buono), rails (il brutto) and heroku (il cattivo)

Il buono

This article is just paraphrasing this one with a bit more accuracy, corrections and cynicism.

First step: Update your rails code on heroku

The route /.well-known/acme-challenge/KEY  should be added to your config/routes.rb  file like so

get '/.well-known/acme-challenge/:id' => 'CONTROLLER#letsencrypt'

where CONTROLLER  is the controller of your choice, in which the method should look like this
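A sketch of such a method, assuming the LETSENCRYPT_CHALLENGE / LETSENCRYPT_KEY config vars described in the third step (render plain: is the modern form; on Rails 4 use render text:):

```ruby
# Answers the ACME http-01 challenge with the key authorization
def letsencrypt
  if params[:id] == ENV['LETSENCRYPT_CHALLENGE']
    render plain: ENV['LETSENCRYPT_KEY']
  else
    head :not_found
  end
end
```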

and don’t forget to make it “public”, so if you are using cancancan the following line is required on top of your controller file
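With cancancan, that line is:

```ruby
# Skip cancancan's authorization check for the ACME challenge action
skip_authorization_check only: [:letsencrypt]
```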

Push it on heroku
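Assuming the classic Heroku git remote:

```shell
git push heroku master
```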

and wait for it to be deployed.

Second step: Install required software and generate the key

On Ubuntu you can install the letsencrypt command like this:
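On the Ubuntu releases of that era, the client was packaged as letsencrypt:

```shell
sudo apt-get install letsencrypt
```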

Then run the command with root privileges:
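For example, with your domain in place of the placeholder:

```shell
sudo letsencrypt certonly --manual -d www.example.com
```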

Follow the instructions, and when it asks you to verify that the given URL is reachable, don’t press ENTER but follow the third step instead.

Third step: Update Heroku variables

Go to the Heroku console, in Settings > Reveal config vars, and add the LETSENCRYPT_KEY and LETSENCRYPT_CHALLENGE keys with their corresponding values from the letsencrypt command in the previous step.

Restart Heroku within the UI or with the following command, where YOUR_APP_NAME is… your app name.
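```shell
heroku restart --app YOUR_APP_NAME
```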

It would be a good idea to try the URL from your browser before continuing.

Fourth step: Verify the challenge and push certificate to Heroku

If your SSL endpoint is not yet set up on Heroku, take the time and money to do it.
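At the time, this meant provisioning the paid SSL endpoint add-on:

```shell
heroku addons:create ssl:endpoint --app YOUR_APP_NAME
```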

Then you will be able to push the certificate to your Heroku instance.
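Something like this, with the letsencrypt live directory paths adapted to your domain (sudo because that directory is root-owned):

```shell
sudo heroku certs:add /etc/letsencrypt/live/www.example.com/fullchain.pem \
  /etc/letsencrypt/live/www.example.com/privkey.pem --app YOUR_APP_NAME
```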

If it’s a certificate update, replace certs:add with certs:update and you are good.

Fifth and last step: Behold!

Give yourself some time for a walk and think about the beauty of living, yet still away from the coming technological singularity.

Slow Query with “LIMIT 1” in Postgresql

Recently I had issues with a production database running slow on tables with more than 1M rows.
This is a complex Ruby on Rails app connected to a Postgresql database, and I went through the whole stack to figure out that it was a simple index problem. Very little related documentation was found about this issue, therefore I’m quickly posting a summary of my adventure.

In Rails I often write something as simple as ActivityLog.first or Player.find(123).activity_logs.first , where the model ActivityLog has a default scope described like so:
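The assumed shape of that scope (Rails 4 syntax):

```ruby
class ActivityLog < ActiveRecord::Base
  # Newest entries first, so .first returns the latest log
  default_scope { order(created_at: :desc) }
end
```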

This leads to a SQL query looking like this:
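Roughly, with the table and column names implied by the models above:

```sql
SELECT "activity_logs".* FROM "activity_logs"
WHERE "activity_logs"."player_id" = 123
ORDER BY "activity_logs"."created_at" DESC
LIMIT 1;
```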

This is probably one of the most common queries in my Rails applications, and certainly not only in my applications.

Unfortunately, this may become a slow query on big tables. Indeed, if this query is called often, your app may go from sluggish to unusable (the latter was my case).

The following example illustrates the issue.
To repeat: this may be a very common query:

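The timings below were obtained with EXPLAIN (ANALYZE, BUFFERS) on that query:

```sql
EXPLAIN (ANALYZE, BUFFERS)
SELECT * FROM activity_logs
WHERE player_id = 123
ORDER BY created_at DESC
LIMIT 1;
```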
On my dev machine, which is way more powerful than our production VPS, this query takes 1’124ms to run. If called even once per user click, this would immediately degrade your Apdex score.

As described in the Postgresql doc, and by reading the EXPLAIN (ANALYZE, BUFFERS) output, a lot of disk access caused by the filter on player_id makes the query slow.

The numbers provided by BUFFERS help to identify which parts of the query are the most I/O-intensive.

The planner decided to use an index scan on created_at (the ORDER BY attribute) and then filter on player_id , which in this situation isn’t the best choice. Disabling index scans with SET enable_indexscan = OFF; drastically improves performance, but it’s not a production solution.

The query execution time drops down to 0.05ms, a factor of 22’480 compared to the previous planner decision!

As described here, another way to trick the planner is to add another field to the ORDER BY clause.
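For instance, using the primary key as a tie-breaker (the id column is an assumption):

```sql
SELECT * FROM activity_logs
WHERE player_id = 123
ORDER BY created_at DESC, id DESC
LIMIT 1;
```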

This improves the execution time in a comparable way. In a hurry, this is the solution I adopted: I quickly patched my scope that way to improve performance globally. It was safe and easy to commit, with the advantage of not expanding the database index size.

But for the sake of a better understanding, I dug into this issue a bit more. I have more than one big table, and I have more than one app online… I don’t like to keep such gray zones and smoking workarounds in my code.

First, I tried to help the planner change its decision by tweaking the statistics on every column, but it didn’t change the planner’s decision and didn’t improve the execution time.
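For the record, that kind of tweak looks like this (the statistics target value is just an example):

```sql
ALTER TABLE activity_logs ALTER COLUMN player_id SET STATISTICS 1000;
ANALYZE activity_logs;
```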

Then I realized that simply building an index on player_id (my condition) and created_at (my ordering attribute) should let Postgresql execute my query efficiently with the existing planner decision.
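In plain SQL, that composite index is:

```sql
CREATE INDEX index_activity_logs_on_player_id_and_created_at
  ON activity_logs (player_id, created_at);
```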

…and the end result is pleasant enough. 0.03ms – my work is almost done.

On the Rails side, a simple migration like the following changed my users’ lives, and restored my peace of mind.
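A sketch of that migration (the class name is illustrative):

```ruby
class AddPlayerCreatedAtIndexToActivityLogs < ActiveRecord::Migration
  def change
    # Composite index matching the WHERE + ORDER BY of the hot query
    add_index :activity_logs, [:player_id, :created_at]
  end
end
```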