As the end of GSoC 2018 approaches, I have been adding the few remaining features from the proposal and polishing the rough edges of my project. Here is my week-by-week progress for the last few weeks. If you want to jump straight to the plan for the last coding week and the submission week, click here.
Week 9
As you might remember from the last post, I was able to regain some momentum after being away for a couple of weeks. This week, I continued with that momentum and created the home page, with a table showing the compliance results for public servers. Since it is a pretty large table, it can’t fit entirely on a single screen. The user would have to scroll a lot to compare different values and then come back to the top (or left) to see which test (or server) they are actually looking at. So, we needed to keep the header and the first column fixed at the top and left respectively to prevent this inconvenience. Refer to the GIF if I couldn’t explain it properly. I found this great repository that does exactly that. With some minor adjustments, I was able to use it for the main compliance table. Other than that, I worked on improving and optimizing the SQL queries and on general bug fixes.
Spoiler Alert: It didn’t turn out to work all that great for my use case.
Week 10
I worked on writing tests and improving the database code. I was able to cover all the database methods with unit tests, which helped me spot and fix a couple of bugs in the database code and make improvements to it. I finally got to appreciate the power of writing tests. Though I could not do it for this project, I have decided to go with Test-Driven Development for all my future projects. After completing the rather monotonous task of writing unit tests, I proceeded to build an email subscription system for sending alerts about a server’s compliance results. Before I even began working on it, my mentor kindly pointed me to resources on sending emails. It was one of the more interesting parts of my project. To subscribe to a server, a user has to verify their email address by opening a verification link, which is generated by appending a random UUID to the timestamp of the request. All the verification codes are stored in a Map along with the details of the request. As soon as a verification link is opened, a new subscriber object is created and added to the database. A subscriber object has three values: an email address, the server to which the subscription is made, and the code for unsubscribing. A link to unsubscribe from a server’s alerts is included in every email sent. In the end, we have email reports for the following events:
- Removal of credentials
- Updating of credentials
- Passing/failing a new test
- Unavailability of test results

I was able to create fairly decent-looking HTML emails with a plain-text fallback.
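The verification flow described above could be sketched roughly like this in Java. Note that all class, record, and method names here are my own illustration, not the project’s actual code; the only details taken from the post are the UUID-plus-timestamp code, the Map of pending requests, and the three fields of a subscriber:

```java
import java.time.Instant;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class SubscriptionManager {

    // Details of a pending subscription request, keyed by its verification code.
    record PendingRequest(String email, String server, Instant requestedAt) {}

    // A verified subscriber: email address, subscribed server, and the
    // unsubscribe code that is included in every email sent.
    record Subscriber(String email, String server, String unsubscribeCode) {}

    private final Map<String, PendingRequest> pending = new ConcurrentHashMap<>();

    // Build a verification code from a random UUID plus the timestamp of
    // the request, and remember the request details until the link is opened.
    public String createVerificationCode(String email, String server) {
        Instant now = Instant.now();
        String code = UUID.randomUUID() + "-" + now.toEpochMilli();
        pending.put(code, new PendingRequest(email, server, now));
        return code;
    }

    // Called when a verification link is opened: consume the pending request
    // and turn it into a Subscriber with its own unsubscribe code.
    public Subscriber verify(String code) {
        PendingRequest request = pending.remove(code);
        if (request == null) {
            return null; // unknown or already-used verification code
        }
        return new Subscriber(request.email(), request.server(),
                UUID.randomUUID().toString());
    }
}
```

In a real deployment the verified subscriber would then be persisted to the database, and the pending map would need some expiry for codes that are never opened.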
Week 11
Now that all of the features of the project were done, my mentor gave me some feedback on improving the project and adding final touches. For example, the instructions for passing failed tests were contained in a JSON file like this:
```json
{
    "name": "roster_versioning",
    "possible": true,
    "since": "v0.4",
    "instructions": "In mod_roster, set versioning to true.<br>mod_roster:<br> versioning: true<br>",
    "modulesRequired": [
        {
            "name": "mod_roster",
            "type": "core"
        }
    ]
},
{
    "name": "xep0191",
    "possible": true,
    "modulesRequired": [
        {
            "name": "mod_blocking",
            "type": "core"
        },
        {
            "name": "mod_privacy",
            "type": "core"
        }
    ]
},
```
It looked like a good idea to me at the beginning, but we ended up with a lot of messy HTML in the JSON file, and the whole system turned out to be highly inflexible and untidy. So, my mentor suggested switching to simple Markdown files for the help texts. We now have the following structure for the help files, and each individual help file is plain old Markdown:
```
help
|
|-prosody
| |
| |-xep-0363.md
| |-xep-0384.md
```
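To give an idea of the format, a help file such as `xep-0363.md` might look something like the following. This is a hypothetical sketch of my own, not the project’s actual content, showing how the HTML that used to live in the JSON `instructions` field turns into ordinary Markdown:

```markdown
## XEP-0363: HTTP File Upload

Enable the HTTP file upload module in your Prosody configuration:

    modules_enabled = {
        -- ...
        "http_upload";
    }
```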
I implemented all of the feedback items from my mentor, along with other tweaks I felt were necessary. But most of the week was spent trying to come up with a good logo. I tried several ideas, but all of them sucked, so I had to be content with a mediocre logo. I also ran a Twitter poll for this logo, and the results were disappointing ☹️
But the week ended on a high note, with great positive feedback from the community about the website 😃
Plan ahead
Last week, I noticed that the table showing compliance status starts to lag pretty badly when there are more than 100 servers. In fact, it started taking a good couple of seconds to render on older phones. Moreover, on scrolling, the header wobbled a lot, even on desktop Firefox. This is far from ideal, as the current compliance table already contains more than 100 servers. So, the first thing to do this week will be to come up with something more efficient for rendering the table. As you might have read above, I had lazily used a library without paying much attention to the constraints of this project. I have been mulling it over during the weekend and have a few promising ideas for implementing it in a much more efficient way.
Other than that, I will also try to make it more obvious that clicking on a failed test shows a help pop-up. But the main remaining task will be to separate the web server component from the main compliance tester. This way, we can have a command-line tool for running tests, like the old Compliance Tester used to provide. That will make it a complete replacement for the existing tool. Once that is done, I will do some quality control and testing to make sure it works consistently across all machines.
Going into the submission week, I plan to rewrite the README.MD file and the About page to explain things better. I will also write a CONTRIBUTING.MD file to help people, such as server software developers, write the help files.