Public Lab Research note

GSOC-18 Email notification overhaul

by vidit


About me

Name Vidit Chitkara

Affiliation Guru Gobind Singh Indraprastha University, New Delhi, India-110078

Location Delhi, India-110034

Github ViditChitkara



Project Title Email Notification Overhaul

Gitter ViditChitkara

Project description

Email notification overhaul.

Abstract/summary (<20 words):

Reply to comments by email, send daily digests to Publiclab users, and a separate dashboard to handle digest settings.


The aim of this project is to improve the user experience and deepen users' involvement with PublicLab. The project aims to achieve the following milestones:

  • Reply to comments by email
  • Daily digests for users (this would help gather all their preferred content in one place).
  • Users will be able to configure digest-related settings.


1. REPLY TO COMMENTS BY EMAIL: In the current implementation of plots2, one needs to visit the website (a particular node) in order to comment or reply to comments. A better implementation could use the mailman gem, which would let users reply via email. We need to follow these steps:

  1. Add the mailman gem to the Gemfile and run bundle install to install the dependencies.
  2. Write a script to configure and run mailman. This would go in the script folder (script/mailman_server).
  3. Mailman supports several methods of receiving email; we could use POP3. POP3 requires the receiver's information such as username, password, etc. The script would extract each incoming mail as a message object and pass it to a "receive_mail" method on the Comment model.
  4. The "receive_mail(message)" method would contain the comment-related logic. It could be implemented as:
    i) Identify the node id from the message subject using an appropriate regex (something like /^#(\d+)$/).
    ii) Identify the user from the message object (it provides a message.from method to identify the sender's email).
    iii) Identify the comment body from the message object. The message.body.decoded method gives us the comment body.
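The parsing steps above can be sketched in plain Ruby. This is only an illustration: the method name `parse_reply`, its arguments, and the looser `/#(\d+)/` regex (which survives "Re:" prefixes, unlike the anchored variant) are assumptions; with the mailman gem the inputs would come from `message.subject`, `message.from`, and `message.body.decoded`.

```ruby
# Sketch of steps i)-iii): pull the node id, sender and body out of a mail.
NODE_ID_PATTERN = /#(\d+)/ # looser than /^#(\d+)$/ so "Re: #23 ..." still matches

def parse_reply(subject, from, decoded_body)
  match = subject.match(NODE_ID_PATTERN)
  return nil unless match # subject carries no node id: not a reply we can place

  {
    node_id: match[1].to_i,  # i)   which node the comment belongs to
    email: from,             # ii)  the sender, used to look up the user
    body: decoded_body       # iii) the decoded comment text
  }
end
```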


2. Creating a comment from the mail received and decoded by mailman, and the design plan for it: Now that we have all the information required for making a comment, we need to follow these steps:

  1. First, we need an email_reply (boolean) column in the comments table to distinguish normal comments from reply-by-email comments. A new migration file would add the email_reply column to the comments table.
  2. Next, execute a create query on the Comment model for the identified node, e.g. node.comments.create(user: user, body: body, email_reply: true).
  3. Finally, render the comment on the associated node with an icon to differentiate email replies from normal comments, as shown in the following screenshot.
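Step 1's migration could look something like this; a sketch assuming Rails 4.x migration syntax, with a hypothetical class and file name:

```ruby
# db/migrate/xxxxxxxxxxxxxx_add_email_reply_to_comments.rb (hypothetical file name)
class AddEmailReplyToComments < ActiveRecord::Migration
  def change
    # Defaults to false, so existing comments are treated as normal web comments
    add_column :comments, :email_reply, :boolean, default: false
  end
end
```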


3. Daily digest email design:
A basic template of the daily digest mail is shown in the following screenshots.

This part could be broken down as follows:

  • A separate method, e.g. "send_digest(user, top_picks)", where user and top_picks are arguments corresponding to a Publiclab user and their subscription/digest notes respectively. This method would go in the subscription.rb file.
  • A possible design for the digest is shown in the above screenshot.
  • user.rb already has a method (content_followed_in_period(start_time, end_time)) which gives us the digest notes. These notes would be passed (as top_picks) to the 'send_digest' method in the subscription mailer.

4. Using activejob for sending digest mails:

Sending daily digests to thousands of users per day (through basic synchronous requests to the server) would not be feasible at all. We require a service that can make asynchronous requests, so that there is not much load on the servers. Rails provides a built-in framework for handling asynchronous tasks known as Active Job. Active Job is a framework for declaring jobs and making them run on a variety of queuing backends. These jobs can be everything from regularly scheduled clean-ups, to billing charges, to mailings: anything that can be chopped up into smaller units of work and run in parallel.
Currently the Publiclab app uses Rails 4.1.16, which does not support Active Job. So we first need to upgrade the app from Rails 4.1.16 to 4.2.6 (a stable 4.2 release).
For queuing and executing jobs in production we need a queuing backend. We can use Resque as the adapter for handling background tasks. Resque is a Redis-backed library for creating background jobs, placing those jobs on multiple queues, and processing them later.
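Wiring Resque in as the Active Job backend is a one-line configuration change (a sketch; it assumes the resque gem is already in the Gemfile):

```ruby
# config/application.rb
config.active_job.queue_adapter = :resque
```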
Since we need to send digest emails daily to multiple users, there should be a scheduler which runs the job at a particular time every day. For this purpose we can use the whenever gem. Whenever is a Ruby gem that provides a clear syntax for writing and deploying cron jobs. The following architecture and steps explain the basic flow of sending daily digests using Active Job:

a) The whenever gem creates a cron job which calls the job every 24 hours. A schedule.rb file needs to be created in the config folder; all the jobs are scheduled there, e.g.:

every :day, at: '12:20' do
  runner "DailyDigestJob.perform_later"
end

b) Since thousands of mails are going to be rolled out at once, they would be sent asynchronously via Resque (the queuing backend). Active Jobs live in the jobs folder inside the app folder, with a separate file per job (daily_digest_job.rb in our case). All the logic that needs to run asynchronously resides in its "perform" method.

class DailyDigestJob < ActiveJob::Base # ApplicationJob in Rails 5+
  queue_as :default

  def perform(*args)
    # Send the digest to every user who has opted in to emails
    users = User.where(allow_send_email: true)
    users.each do |u|
      DigestMailer.send_digest(u, Node.all).deliver_later
    end
  end
end

c) Finally the enqueued jobs would be dequeued (handled by resque) and mails would be sent asynchronously.


5. Testing the digest email feature: For testing the digest email feature, we could send mails to a limited number of people. Only users with certain tags (e.g. email-tester) would be sent the mail. The test recipients would be hardcoded in the source code.

6. Digest email settings: Finally, we should give users an option to configure email-related settings. This could be done by adding a form with checkboxes on the subscriptions page. The implementation steps would be as follows:

  1. Make a migration file adding boolean attributes (receive_digest, post_comments, etc.) to store user preferences for receiving mails.
  2. Make a controller method (e.g. configure_user_preferences) which would receive a POST request with params consisting of user preferences. This method would make the relevant database entry in the user model.
  3. Make the relevant route entries (mapping the route to controller#action).
  4. Finally, write some front-end code for the notification design. This would go in the home/subscriptions.rb file. The following screenshot shows a prototype of the design.
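Step 2's controller method would need to whitelist the incoming preference params. A plain-Ruby sketch of that filtering follows; the helper name is hypothetical and the allowed list is drawn from step 1's attributes:

```ruby
# Boolean preference columns proposed in step 1.
ALLOWED_PREFS = %w[receive_digest post_comments].freeze

# Keep only known preference keys and coerce checkbox values ("1"/true) to booleans.
def extract_preferences(params)
  params.select { |k, _| ALLOWED_PREFS.include?(k) }
        .map { |k, v| [k, v == '1' || v == true] }
        .to_h
end
```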


Community Bonding Period [April 23 – May 14]-> Discuss with my mentor the relevant issues of the project and their requirements. Break the feature into several first-timer issues to encourage greater participation. Then start with prototypes of the features.

[May 14 – May 20]-> Implement reply by email part. Read more about mailman gem and how to write best scripts for receiving the mails. Then write mailman script for handling incoming mails.

[May 21 – June 3]-> Make relevant changes in the user model and make the design for the comments. Do intensive testing of the feature.

[June 4 – June 10]-> Write tests for reply by email.

[June 11 – June 15]-> First Evaluation. Write documentation for the reply by email part.

[June 15 – June 21]-> Design basic template for digest email. Make send_digest mailer at the backend along with all the required logic.

[June 22 – June 28]-> Integrate redis/ resque (queuing backend) and testing it.

[June 29 – July 8]-> Make a separate job for sending digest emails. Refine the digest email template.

[July 9 – July 13]-> Second evaluation. Write tests for the digest email part.

[July 13 – July 19]-> Testing the digest email feature by sending mails asynchronously to a smaller set of people (having certain tags). Write documentation for digest email part.

[July 20 – July 26]-> Implement the UI notification part and write tests for them.

[July 27 – August 6]-> Polish the code in the above parts wherever possible. Test all the parts as a complete system. Complete the documentation.


Yes, I have forked the plots2 repository, and my local clone is up-to-date with publiclab/plots2.


I have been developing web applications for about a year. Ruby makes life much easier by providing gems for almost everything one would need to build a full-fledged web application. This encouraged me to learn web development in the Rails framework.

I worked as a Teaching Assistant for a Web Development (Ruby on Rails) course at an educational start-up, Coding Ninjas.

Currently I am working as a web development intern at a healthcare startup, kvlabs, whose product is a Rails-based web application.

I have been contributing to open-source software for about 6 months. Some of the organizations I have contributed to are openSUSE and Publiclab. In openSUSE I have contributed to OSEM (Open Source Event Manager), an open-source application tailored for managing free and open source conferences. In Public Lab I have contributed to plots2.

Contributions to publiclab

I have been contributing to publiclab (plots2) for the past 5 months. I have 28 merged pull requests, which can be viewed here. I have reported 14 issues, of which 4 are first-timer issues; the reported issues can be viewed here.


Being an active GitHub user, I personally feel that it is really easy and reliable to reply via comments on a particular PR or issue. There are a lot of answers, questions and comments being posted on Publiclab, so replying to questions, answers, comments, etc. via email would provide great ease for users. Also, regular digests would help Publiclab users stay updated about content related to environmental issues. Being a Publiclab user myself, I would love to get all my preferred content in a single place. Hence, the above features would give users a better experience of Publiclab.


I understand that this project requires a time commitment equivalent to a full-time job. Since I am well versed with the project code, I am really pumped up to give my best for this project.

software gsoc soc gsoc-2018 soc-2018 soc-2018-proposals soc-2018-email gsoc-2018-final



Hi, I love the detail with which you've plotted (haha plots2) out the steps to instantiate a comment by email. I want to note that you and @gauravano are proposing work that is adjacent and a little overlapping, which is fine!

Email is a big area -- if you are both accepted into the program, you could both work on different aspects of this, and it's one of our highest-priority projects, so a team of 2 would be great.

Please look over the comments I left on @gauravano's post, and the two of you could potentially work together to break down different parts of the larger project -- one perhaps working on reply by email while the other works on digest subscriptions and "meeting in the middle".

For both you and @gauravano, I think there's some additional planning that could be done on testing. Think about what makes up the minimal amount of changes that can be confirmed with a test, and that's the ideal size of a single PR. If you can break a feature in two, and test one half and merge that first, even if it's not "put into use" in the code, but just exists, tested, in the back-end, that's great! Then that code can be merged, and the next PR can "make use of it" to bring it to life with a separately testable "chunk" of code. Does that make sense?

There are some good mailer-based tests in our test suite already -- both of you can link to those by way of example, and think about how a "reply by mail" feature might be tested? With a test mail to be parsed? With several? What could go wrong? There's a lot there!

And best of all, well-written tests can be used as a reference for how a feature 'should be used' -- they are like documentation, with example code!


Thank you for the review @warren. We will try to break this up into smaller parts and come up with a solution.

Added some more details to this. @warren, @sagarpreet, @bansal_sidharth, @gauravano, @namangupta, @Raounak, @rishabh07, feedback on this would be really helpful.

The demo site looks great. I'm eager to see how we might test out scheduled-sending emails in a 'test mode' before bringing the entire system online. Maybe an email that's sent only to me, hard-coded in, or just to users who have a certain user tag? Then we can bring online the more complete system piece by piece.

Can you discuss with @icarito about the queuing backend? It makes sense that it's a good idea, and we should think through how to deploy it, etc. -- would it live in our GitHub project? Would it run on the same container?

Instead of email_reply let's add a source to the comment model. Then we can in the future think about other sources too -- maybe Twitter!

What if the user sends from the wrong email? Could we respond with an email that says "we don't recognize this email -- click here to add another email to your profile" and have a button for that? We could store it as a hidden user tag, maybe? Not sure.

receive_mail might be named Comment.from_email maybe? I like your idea there!

One more -- send_digest -- could that just be a method of User? So, user.send_digest(nodes) maybe?

Thanks for thinking this through so deeply!


Nice work out there @vidit. It is well explained. You can have a look at @warren's suggestions to make it even better.

Instead of email_reply let's add a source to the comment model. Then we can in the future think about other sources too -- maybe Twitter!

Great idea!

What if the user sends from the wrong email?

This might be the case when the user knows exactly the node_id to which they are replying. In our case, the user will reply (on an email thread) with a pre-filled subject. We need to find a way for the subject of the reply mail to be auto-filled with the node_id (like #23). However, for safety's sake we could implement the solution you gave (storing the email as a hidden user tag). Your thoughts?

receive_mail might be named Comment.from_email maybe? I like your idea there!

That's great. Or something like Comment.generate_from(email) ?

One more -- send_digest -- could that just be a method of User? So, user.send_digest(nodes) maybe?

How about having a user.send_digest(nodes) method which would internally call a method (of subscription mailer) to deliver the mails ?


Hi @icarito, how about having Resque as the queuing backend (required for Active Job)? We could use Capistrano Resque in association with Resque in production. Resque provides a really nice interface to monitor all the background jobs, as shown below: resque-dashboard.png

Your thoughts on this??


I'm eager to see how we might test out scheduled-sending emails in a 'test mode' before bringing the entire system online. Maybe an email that's sent only to me, hard-coded in, or just to users who have a certain user tag? Then we can bring online the more complete system piece by piece.

To do this in the actual app we first need to migrate to Rails 4.2 so that the Active Job functionality can work. After porting to Rails 4.2 we could follow the exact steps I took in the demo app to integrate activejob, resque and whenever. For testing purposes we could send mails only to selected users (as suggested) with certain tags (e.g. tester). Currently the demo site (which has the Active Job functionality) sends mails every minute (the testing can certainly be turned off by the user). @warren, kindly give your views on this.


Thanks @Raounak for the feedback.

Hi @vidit, Sincerely I am unfamiliar with the options exposed. From a superficial read, does this mean we need to add Redis to our deployment? I've not used it in the past, but I'm open to learn it if we have to use it! In general please make sure any dependencies we integrate must be supported upstream. For instance the capistrano-resque repo you linked hasn't had a commit in 1.5 years and nobody responds to issues.


Yes @icarito, we need to add Redis to our deployment. Redis is required by most of the 3rd-party queuing libraries (e.g. resque, sidekiq, etc.). As an alternative to resque we could use delayed_job (although resque is a far better option if we have a larger number of async jobs at a time). Capistrano is an optional utility to automate deployment, so it is our choice whether to use it in our app. It's just that every time we restart our servers we need to run a resque command alongside (rake resque:work). In my demo app I haven't used Capistrano and it works fine. Please correct me if I am wrong and kindly give your feedback on this.

Okay Vidit, maybe @warren has an opinion too. I have read the Active Job documentation for Rails and understand there are options for the queue backend. From the list, I think Redis has the most support.

For deployment strategy, we don't use capistrano but we use docker-compose. Currently in production this is how we run MariaDB and expect to run the rails application soon too. There is a Redis image you can use.

@vidit nice proposal

Hi @vidit, I read your proposal and it looks good; the implementation part is solid. But please divide your work into a weekly schedule; that would make your proposal even nicer.

Thanks @bansal_sidharth2996 and @mkashyap354. Sure @mkashyap354, I'll add a timeline soon.

@mkashyap354, @warren, I have added the last part of the proposal (user preferences) and also the timeline. Please have a look and suggest changes.

Hi @vidit! Great mockups, and I always love the flowcharts!

The proposal is very well documented! Just one doubt here about the digest email settings:

I haven't used the whenever gem, and was wondering if it is possible to change the scheduled task dynamically? Like if a user does not want to receive a digest email every day/every week but wants one every 2 weeks or every month? Is it possible?

Thanks and I hope this helps :)


Thanks to everyone for thinking this through so carefully. It looks like a great proposal -- I echo @sagarpreet's questions, but to be clear we don't have to answer every question before the deadline -- if this project is accepted, we'll have time to figure things out!

Best of luck and thanks!!!

That's quite a task @sagarpreet. I don't think it would be feasible to dynamically update the scheduler itself. I found one possible solution here: we could keep a column in the user model (next_scheduled_email:datetime) which would be updated each time the mail is sent to the user. E.g. if I want to receive digest mails every two days, we could call SubscriptionMailer.send_digest().deliver_later(wait_until: 10.hours.from_now) (say) and update my next_scheduled_email to 29th March 2018, and so on. This could be done for each user each day (maybe at the beginning of the day); on each day only those users whose next_scheduled_email is today's date would be selected. Not sure if it would work; @warren, is it worth giving a go?
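The per-user scheduling idea above can be sketched in plain Ruby. DigestUser here is a stand-in for the plots2 User model, and next_scheduled_email / interval_days are the proposed (hypothetical) columns, not existing schema:

```ruby
require 'date'

# Stand-in user with the proposed next_scheduled_email column and a per-user interval.
DigestUser = Struct.new(:name, :next_scheduled_email, :interval_days)

# Run once a day: pick only the users whose digest is due on (or before) this date.
def users_due_on(users, date)
  users.select { |u| u.next_scheduled_email <= date }
end

# After sending, push the user's next digest out by their chosen interval.
def reschedule(user, sent_on)
  user.next_scheduled_email = sent_on + user.interval_days
  user
end
```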


Thanks @warren and @sagarpreet for going through this.
