The NWDUG (North-West Drupal User Group) Unconference was held on 12th October, and Alastair - one of our Drupal Developers - went along to learn and share knowledge.
An unconference is a conference where the attendees decide the agenda topics, usually at the beginning of the event. It’s a great way to learn from peers.
The agenda takes shape
At the start of the day, attendees were encouraged to stick proposals for half-hour sessions, written on post-it notes, onto a schedule board. A session could be a traditional talk, a question and answer session or a workshop, and the day offered a good mix of all three. Most were directly related to Drupal, but others were more left-field, covering subjects such as the mechanics of Tetris or imposter syndrome.
I have something of an interest in front end development, so I signed up for a couple of sessions on front end frameworks - Vue.js in particular - and how to integrate this technology with Drupal.
Progressive decoupling with Vue.js
This session started by describing decoupling, the concept of separating the back end (data) from the front end (presentation). A traditional Drupal site does both of these things: the user enters data into the content manager in the back end, and Drupal renders the HTML and returns it to the user. In a decoupled site, the front end is instead generated by a front end framework such as React or Vue.js.
Progressive decoupling sits somewhere in between the traditional Drupal site and a fully decoupled one. Drupal still generates the HTML, but a single block or section of the presentation is rendered by the front end framework instead. This approach requires integrating a front end framework - in this session, Vue.js - within the theme.
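As a rough sketch of the idea (not code from the session), a theme might attach a small script that mounts a Vue instance onto a single placeholder element that Drupal has already rendered, leaving the rest of the page untouched. The element id and endpoint below are illustrative assumptions:

```js
// A minimal progressive decoupling sketch: Drupal renders the page,
// and Vue takes over just one placeholder element within it.
new Vue({
  el: '#latest-articles', // a placeholder rendered by the Drupal theme (assumed id)
  data: { articles: [] },
  created() {
    // Pull content from Drupal's JSON API endpoint (this path assumes
    // the core JSON:API module is enabled).
    fetch('/jsonapi/node/article')
      .then((response) => response.json())
      .then((doc) => { this.articles = doc.data; });
  }
});
```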
The talk covered a few methods of debugging Vue.js when working with Drupal. There are some “gotchas” to be aware of, as Vue.js uses the same {{ ... }} interpolation syntax as Drupal 8’s Twig template engine, which can result in Twig trying to render data that is actually the responsibility of Vue.
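As an example of working around that clash - a common fix, though not necessarily the exact one shown in the talk - Vue lets you swap its interpolation delimiters so that Twig keeps sole ownership of the {{ ... }} syntax:

```js
// Both Twig and Vue use {{ ... }} by default. Giving Vue alternative
// delimiters means Twig can process the template server-side without
// touching the expressions intended for Vue.
new Vue({
  el: '#greeting-block', // illustrative element id
  delimiters: ['${', '}'], // markup now uses ${ message } for Vue bindings
  data: { message: 'Rendered by Vue, ignored by Twig' }
});
```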
The half-hour session was long enough to gain a basic understanding of decoupling the Drupal front end using a framework such as Vue.js, but not long enough to go into great detail. I felt excited about this technology and this new way of developing interfaces, and I would certainly consider it for new projects where it would be beneficial.
Building Snazzy Sites with Gridsome and Contenta CMS
While the previous talk focused on progressive decoupling, “Building Snazzy Sites...” was all about fully decoupling Drupal from the front end using Contenta CMS and Gridsome. Contenta CMS is a Drupal API distribution specifically targeting those who want to fully decouple their Drupal sites. Gridsome is a Vue.js based framework that enables developers to generate static HTML sites from data provided by a CMS (Drupal, WordPress etc.), from static Markdown, CSV or YAML files, or simply from a database or API.
The demo showed how Contenta generates JSON, structured using the JSON API specification, that can easily be read by Gridsome. Gridsome in turn converts the data into static HTML, which has the benefit of being extremely fast - no calls to a database are required - and can be hosted on almost any kind of hosting platform. As Gridsome is a Vue.js based framework, it enables the developer to integrate any number of Vue.js components to build complex user interfaces on top of the data.
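To give a flavour of the setup (a sketch under assumptions - the plugin name and options below are illustrative, not necessarily what the demo used), a Gridsome project declares its data sources in gridsome.config.js:

```js
// gridsome.config.js - a minimal sketch of pointing Gridsome at a
// Contenta/Drupal back end over JSON API. The plugin name, URL and
// options are assumptions for illustration.
module.exports = {
  siteName: 'Snazzy Site',
  plugins: [
    {
      use: 'gridsome-source-drupal', // hypothetical Drupal source plugin
      options: {
        baseUrl: 'https://contenta.example.com',
        apiBase: 'api' // path under which the JSON API is exposed (assumed)
      }
    }
  ]
};
```

At build time, Gridsome pulls the source data into its internal GraphQL layer and writes out static HTML pages, which is where the speed benefit comes from.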
This combination of Contenta and Gridsome makes for a very strong and flexible platform for building front end heavy websites with all the benefits, familiarity and stability of Drupal in the back end.
Drupal QA Tools
We saw demos of two Drupal developer tools, Drupal PHPQA and Drupal Code Quality Checker, both of which do a similar job but target different ends of the development process.
Drupal PHPQA
Drupal PHPQA combines various tools, such as phpcs (PHP CodeSniffer), phpmetrics and phpmd (PHP Mess Detector), and generates a report after running that helps a developer improve the quality of their code and adhere to coding standards.
While Drupal PHPQA was promoted as something to be run by a continuous integration tool such as Jenkins, there is certainly no reason why a developer couldn’t run the report while developing a particular feature.
Drupal Code Quality Checker
The second tool, Drupal Code Quality Checker, works alongside the developer as they go about their duties. It hooks into Git, running whenever a developer commits code to their local repository. The committed changes are passed through Drupal Code Quality Checker and, if any of the code fails any of the tests - which, like Drupal PHPQA above, include phpcs, plus additional tools such as PhpLint to ensure the PHP is valid - the developer is notified and a report is generated.
One of the benefits of this particular tool is that it only checks the code the developer is currently working on, so any legacy code is left unaffected and untested.
As a developer, I can see the value of both of these tools, and both can be configured to suit a particular requirement. Applied to a legacy project with a significant amount of code that might not pass the tests, they can be configured to ignore the existing code and only look at new code. They provide a great way of checking whether your code can be improved, and help developers get into the habit of writing quality code. They have the added bonus of assuring the client that the software being developed for them complies with standards and is of high quality.
Get your crashes and burns reported
This was a short talk on two pieces of technology: the Reporting API, and an associated Drupal module currently in development, Reporting.
Reporting API
Reporting API is a W3C specification that has been developed and recommended by Google, and is already in use in their Google Analytics software.
The essence is to let the user’s browser report errors automatically. The server declares a reporting endpoint in its HTTP response headers, and errors such as 404s (page not found), invalid SSL certificates or the use of outdated API code (perhaps an old version of Google Maps) are then sent by the browser to that endpoint, where they can be recorded in some kind of logging system.
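On the client side, the browser also exposes these reports to JavaScript through the ReportingObserver API (available in Chromium-based browsers), which gives a feel for the kind of data involved. A minimal sketch:

```js
// Watch for reports (e.g. deprecations and browser interventions)
// generated for the current page. 'buffered: true' also delivers
// reports created before the observer was registered.
const observer = new ReportingObserver((reports) => {
  for (const report of reports) {
    // Each report carries a type, the URL it relates to, and a body
    // with the details - the same data a logging endpoint would receive.
    console.log(report.type, report.url, report.body);
  }
}, { buffered: true });

observer.observe();
```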
Reporting
In the case of the Reporting module for Drupal, these errors are recorded in Watchdog. The module is very much in development, and currently no reports are generated. However, the demo showed how the reporting configuration is injected into the HTTP headers, and how developers could extend the module in their own modules to report errors or notifications. This is something worth keeping an eye on, and it may make error reporting an easier prospect, especially as it is built into the browser.
JSON API Reference
We saw a demo of another module in development that allows developers to create a reference field pointing to an external source that uses the JSON API specification. This would enable developers and content creators to build relationships between Drupal content and external data: perhaps embedding Disqus comments related to a particular Drupal node, or creating relationships - and consequently sharing content - between two independent Drupal sites, using JSON API to connect the two.
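The module itself lives on the Drupal side, but to illustrate the kind of external source it would point at, here is a sketch of fetching a resource that follows the JSON API specification (the URL is made up):

```js
// Read a JSON API document from an external site. Every JSON API
// response wraps its payload in a top-level 'data' member, which is
// what a reference field would link Drupal content to.
fetch('https://other-site.example.com/jsonapi/node/article/123', {
  headers: { Accept: 'application/vnd.api+json' } // the JSON API media type
})
  .then((response) => response.json())
  .then(({ data }) => {
    console.log(data.type, data.id, data.attributes.title);
  });
```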
A Brief History of Personal Failure
The last session I attended was a discussion on failure, or more specifically how to learn from it. The underlying theme throughout the talk was that failure is a learning process that we can use to grow.
I have not failed, I’ve just found 10,000 ways that don’t work.
~ Thomas Edison
Crispin Read talked about his experience of learning to bake sourdough bread, a process in which failure played a large part. He had intentionally failed numerous times so as to get a better understanding of the process of making sourdough. Along with this idea of using failure as a learning process, it was emphasised that we shouldn’t feel bad about failure, that we shouldn’t confuse failure with ability or worth, and, most importantly, that we shouldn’t give up as a result of failure.
At the end of the talk, various members of the audience gave examples of where they had failed and what they had learned from it. It was all approached in a humorous way, with the rest of the audience cheering on each speaker as they admitted to failure.
The NWDUG Unconference - now in its fourth year - was a great success. There were many different topics of discussion, a great, welcoming atmosphere, and interesting chats between sessions about the projects people had been working on and how they’d gone about them.
The popularity of this year’s event shows there is demand to run it again next year. It was excellent value for money at only £20 for the day, and I’m looking forward to attending next year’s unconference!