9
Apr
2014

Africa road trip: The challenges for app design and development

by Joe Dollar-Smirnov


After speaking to over 50 locals about the Haller app, making reams of notes, taking over 600 photos and shooting 25 videos of usage and interviews, we are now a few days away from boarding our flight back to Blighty.

Coming to Africa to test a prototype was always going to be as much about the technology available to the people as about the usability of the app itself. Dr Rene Haller has spent nearly 60 years in Africa demonstrating how to turn wasteland into rich land that can yield provisions for a family of 5 without ever relying on external aid. During the past week we have met the community farmers on whom his work has had a direct impact. The app, created in a collaboration between Pearlfisher and Red Badger, is showing great potential to bring that information directly into the hands of grassroots farmers. These are some of the hurdles we are seeing:


Exposure

Some of the farmers we spoke to had never accessed the internet before our demos. Learning what it means to have internet access has to come before using websites; for some of these people, the whole concept is new. The Haller project includes a technology learning centre that is used to train locals in the basics of internet use. To improve the experience for these people it will be important to offer regular training sessions that build confidence in what is, for some, a brand new technology.


Mobile Technology

The people we are speaking to generally do not have smartphones. The majority of our target users have feature phones that are incapable of accessing an HTML website – a potential blocker for the adoption of a mobile app! Two questions we need to consider: how long do we wait before smartphone saturation reaches critical mass, and can we get this information into people's hands within the current technological constraints? We have been discussing solutions to make this happen, from community group leaders with smartphones and word of mouth through to SMS-based messaging and IVR (interactive voice response) systems.


Language

Although English is the first taught language in Kenya, there are 42 different local dialects. We have spoken to groups covering just 2 – Kikuyu and Mijikenda. The Kikuyu are the largest tribe in Kenya and also happen to be among the most advanced in terms of agricultural development, with seasoned tea and coffee farmers, as we saw during our first workshop sessions in Nyeri, north of Nairobi. Less developed rural communities, such as the Mijikenda, will prefer to read their local dialect, although so far we have found that our target users all know Swahili but are not comfortable with English.

Electricity

Although everybody already has a mobile phone, not everybody has electricity at their disposal. A charge on an older feature phone can last for days at a time, but smartphones tend to use a lot of energy. This does not stop at smartphones: laptops and computers for rural schools suffer from the same issues. Many schools do not have mains electricity – which did not stop the government promising to supply every child in Kenya with a laptop, a programme currently on hold due to inconsistencies in the finances surrounding it.


Our research and usability testing concludes with a series of meetings at the dwellings of our target users. The body of feedback and research is reaching ‘saturation point’: similar comments and issues are being raised by different groups of people, and we are able to form a set of validated recommendations for the ongoing design and development of the application. We’ll summarise our findings on this blog in due course.

5
Apr
2014

Africa road trip: The workshops begin

by Joe Dollar-Smirnov


Three days into the Haller Africa user research field trip.

Day 3

Before we could see anyone or do anything we had to pay a visit to the local government. Devolution is massive right now in Kenya, and as a result the local government needs to know everything that is going on in the county. We sat down with a local leader and explained what we were intending to do in his county: Nyeri County, or NYC.

One full background check and 90 minutes later, we were free to continue with our business, with his aide by our side. Now 5 of us in total, in 2 cars, we headed to our first makeshift lab.


 

We went in to find a group of 13 farmers eagerly awaiting our arrival…


 

This is a group of farmers who are excited at the prospect of technology and how it can help them make more money. They come from coffee and tea plantations and are a mix of workers and leaders, old and young, male and female. The typical farmer in this part of the world owns a feature phone, uses SMS and pays for goods with M-PESA (‘pesa’ being Swahili for money), rural Kenya’s leading method of payment for goods. Technology growth here is phenomenal: around 50% currently have smartphones, that share is growing all the time, and within 12 months the figure will likely be more like 90% at least.

The Workshops

  • Intro
  • Present what we are doing to them – brief overview / aims objectives
  • Present initial visuals – in particular icons
  • Open ad-hoc brainstorm on visuals for the key category icons
  • Discussion about websites they use
  • Give out phones with the app and allow them to play with it (max number per group tbc depending on number of phones)
  • Spend time with each group observing and discussing feedback

The group was extremely receptive, and after a bit of warming up we soon had the whole group sketching ideas like it was second nature. After a short break for tea (local, of course) we passed the phones around for the demo. The number of people who turned up to the session meant that phones had to be shared, but that seemed to work well, as it encouraged collaborative working and discussion.

They were impressed that they could choose the language, as most of the websites they frequent do not translate into Swahili. It was interesting to watch how some of the older farmers interacted with the phones; it was clear that for one or two, it was probably the first time they had used a smartphone.

Here are some other interesting points that came out of the session:

At the end of the usability session we had an open forum for feedback and had an extremely positive response. Even accounting for the Hawthorne Effect it seems that we are on the right path to producing an app that is as engaging and helpful as it is intuitive.

At the end of the day it was time for us to go and witness first-hand how the app could be used in their day-to-day business on their farms.

 


2
Apr
2014

Africa road trip: Day one and two

by Joe Dollar-Smirnov


Day 1 

A smooth 8-hour flight with some surprisingly palatable morsels served up. In reality they were probably disgusting – augmented simply by the knowledge that I would soon be humbled by an altogether less convenient way of life, where food does not come from a magic box of tricks all warm and delicious, brought to you on a platter by a shiny person whose mission it is to leave you smiling.

While I’m waiting on the runway for my bus to take me to arrivals I get my wrists slapped by security for taking photos. Tourist moment. 

My taxi driver tells me that everyone in Kenya has at least 2 phones because the coverage isn’t great: Safaricom works in places Airtel doesn’t, and vice versa. Of course, because he is a businessman, it is essential for him to have two phones anyway.

A maximum-security hotel, because it’s next to the Israeli embassy. Lovely room. Hot shower. Sleep. Lala salama.

Day 2 

Monkey Sunshine Green Tea. What is not to like about that? The coffee is awesome as well.

Five phones: check the charge and try to get onto the hotel wifi to ensure the app is loaded up and ready. The hotel claims to have the fastest internet connection in Nairobi. It does not work at all in my room.

I move to the main restaurant area to get 1.2Mbps down and 1.08Mbps up as of writing, according to speedtest.net.

I met my first contact, a photojournalist and Haller person known for her work with NGOs in developing countries. She promptly briefed me on what to expect before introducing me to her driver and 2 other chaps – the local minister for agriculture and a freelance journalist for the New York Times – who would be assisting me on my initial few days of research about 160km north of Nairobi, in Karatina – the opposite direction to my 3rd destination, Mombasa.

One of those car journeys ensued – one where I was gawping out of the window for the most part, where hours breezed by without a thought. Four hours and several stop-offs in street-side towns later, we arrived armed with our Safaricom SIM cards and credit, ready for an evening of preparation for Tuesday morning: our first session in a makeshift usability lab somewhere off the beaten track.

30
Mar
2014

Africa road trip: Day Zero

by Joe Dollar-Smirnov


 

This coming Tuesday, Red Badger and Haller will be meeting in Africa to begin their 2-week research trip in Kenya.

In a collaboration between Pearlfisher, Red Badger and Haller, a web app has been created to supplement and augment Haller’s already thriving farmer education program, based at their farming training centre in Mombasa. We’ll be going out to present it to the farmers, witness first-hand how it is received, and get their feedback on the work we have done so far.

Week One

We will be focussing on speaking to established farmers of coffee and tea plantations. By holding focus group sessions and creative workshops with the farmers, we hope to understand the day-to-day challenges of running an agricultural business, how they learn and pass on knowledge, and the level of technology and web access they have at their disposal. We have already carried out comprehensive research on the specifics of mobile usage in Kenya, popular phones and likely data packages; however, we anticipate finer detail will emerge while we are on the ground. We’ll follow up the sessions with a presentation and demo of the app, followed by recorded usability testing and post-use questions to feed into a basic quantitative analysis of the prototype.

Week Two

Travelling from Nairobi to Mombasa to spend time at the farmer training centre will allow us to speak to the villagers who benefit the most from the Haller training initiatives. By spending time with the locals, listening and observing them carrying out day-to-day tasks, we aim to get a broader sense of why the app is important and how we can ensure it serves those users properly. We’ll also run some creative workshops to get direct cultural input into the visual design of the app. We’ll be bringing all the iconography with us to present at the training centre and to open up for critique and discussion.

Kit List

  • The prototype (thanks to all the hard work from Sari, Michel, Joe, Albert and Pearlfisher)
  • Testing plan, script and questionnaire
  • Mobile phones (including most popular phones in Kenya)
  • Laptop
  • Separate web cam to record app usage
  • Microphone and camera for usability feedback interviews
  • Paper and Pens

More updates to come.

26
Mar
2014

Oculus Rift, Facebook and Presence in UX

by Joe Dollar-Smirnov

A Link Trainer, a type of flight simulator produced between the early 1930s and early 1950s, in Oak Ridge, in September of 1945. (Ed Westcott/DOE)


 

I awoke this morning to see my news feed filling up with articles on Facebook’s latest $2 billion acquisition of Oculus Rift.

Forms of virtual reality have been around for nearly 100 years, with early examples dating back to flight training simulators for pilots around the First World War. The commercial resurgence in popularity of this obscure medium is thanks mainly to the famous Oculus Rift Kickstarter project, which sought initial funding of just $250,000. Virtual reality became a reality for bedroom developers and early-adopter consumers alike.

Why it’s great for Virtual Reality

Whatever your feelings about the announcement (and about Facebook), the scale of the investment means the people at Oculus Rift can get their heads down and focus on R&D without worrying about where their next round of funding is coming from.

This should mean technological advancement that will benefit everyone.

Often, technological advancement in this area is led by academia, government, aerospace and defence industries, and until now the kind of people who could spend $2bn on the research and development of virtual reality were normally associated with large government organisations. For example, Obama’s proposed DARPA budget for 2015 totals $2.9bn, of which $334m is dedicated to ICT-related activities, including various VR projects.

Compare that to the cash injection for the team at Oculus Rift ($400m) and we can start to see that, if allowed to experiment, they can use this as an opportunity to push the medium further than ever before and start looking at ways to make it accessible and useful across a wide variety of applications – not just gaming and entertainment.

Opportunities in Virtual Reality for UX and visual designers?

As designers we need to understand the challenges of designing for virtual reality. Physical, psychological and cognitive considerations will change the way designers solve problems and evaluate work.

There is no doubt that some of the original Oculus Rift champions will feel alienated by a move that sees it swallowed up by a social networking site, but I think (and hope) we will now see some very interesting and fairly rapid developments in this area that are not centred around gaming. With a few companies starting to compete for market dominance, including Valve and Sony’s Project Morpheus, awesome gaming applications are inevitable. Facebook, however, is in a position to take advantage of its place as a global platform to explore and develop uses for virtual reality – uses mainly seen today in government or university research labs – that reach beyond the social network.

17
Mar
2014

Spree Commerce – The future of E-Commerce?

by Cain Ullah


Red Badger are always looking at new ways of working, with regard both to our tech stack and to our process engineering methods. Our focus is on helping our clients realise benefit as quickly as possible.

We’re always looking at business verticals with big problems that need to be solved in a smarter way. One obvious example is the retail sector. Retailers spend a fortune implementing their e-commerce stores and are often faced with huge, problematic programmes that go live in a big-bang release that goes wrong from day 1.

John Lewis spent £40m on a new e-commerce platform based on Oracle ATG, delivered by Deloitte. It took years to implement, with a big-bang approach at the end of the project. When, like John Lewis, you are selling £1bn of revenue online, that is a long time and a lot of expenditure before you can start to realise value.

John Lewis CIO Paul Coby said the new platform is considered to be one of the default choices for his industry. “ATG is one of the two or three standard platforms for e-commerce in retail.”

“It will give us nothing fundamentally different initially, as we want a smooth transition. The aim is to give us another 10 years of upward development. For example, how we then integrate mobile and social media and new search engines into the site is going to be key.”

TEN YEARS!! And £40m that gives you nothing fundamentally different.

Delivering these programmes is also not without big problems. £40m for John Lewis’ website is expensive, but at least that was what the programme was budgeted at. Another large retailer (which will remain nameless) ended up spending six times its original budget re-platforming onto IBM WebSphere Commerce, resulting in a cost to date of several tens of millions of pounds.

This is quite typical of retailers: go with what everyone else is doing, feel the pain and spend the cash (because it is seen as the safe bet). Paul Coby is delivering what will be considered a very successful, large-scale project that may also be the perfect solution for his organisation. However, for many scenarios, surely there is a better way to deliver e-commerce solutions?

New Technology (and Open Source)

In recent years the speed at which technology has improved has been incredible, and the development of open source software has been particularly impressive. If you speak to Stu he has some strong thoughts on this: his opinion is that GitHub has been massively influential in how people develop software and in the speed at which it develops as a result. The biggest companies (Oracle, IBM et al) just cannot adapt at the speed the collective open source community can. (The pace at which things move takes a lot of effort to stay on top of. But it’s worth it!)

Large companies are starting to embrace open source. Several of our clients have allowed us to develop large enterprise applications for them using a modern open source tech stack (Node.js, Ruby, Elasticsearch, RabbitMQ, LiveScript, Component, etc.). Walmart are now building all of their mobile commerce applications on Node.js, making a strategic investment as they transition from a core Java solution, with a vision of revolutionising retail through technology.

Unfortunately, Walmart are very much the exception in retail. Going with a “default choice” for e-commerce seems to be the norm, and with it comes all of the pain and expense.

An opportunity to do things correctly

We have recently had an opportunity to deliver a complete re-platform of a heritage retailer’s e-commerce website. This retailer had no desire to use a huge monolithic solution such as ATG or WebSphere Commerce, but they had been looking at the following options as serious contenders:

  • Magento – Magento is (or was) an open source platform built on old technology (PHP) and is incredibly slow. It has got so large (8.2 million lines of code) that it has ground to a halt, and it has only 8 open source contributors – that’s over 1 million lines of code per contributor! Back in 2011 it was also acquired by eBay, which effectively incorporated it as a standalone venture. Who knows what will happen to it, but if it is to make a comeback, eBay will need to invest heavily in product development to effectively rebuild it from the ground up.
  • Hybris – Java-based (on the Spring Framework) and now bought by SAP. With SAP focusing on a B2B model and immediately hiking the licence fees for Hybris, its future as a platform for retailers is unknown.
  • Demandware – Built using DMS Script, a proprietary language based on JavaScript. If you use Demandware you have the problem of vendor lock-in, as it takes specialist skills to manage and update. It also has a licence model of 2–3% of your website’s annual turnover; depending on the scale of your business, that can build up to a hell of a lot of money over time, and it isn’t great if you have a CAPEX-oriented business model like our potential client’s.

Looking at these three options, we quickly discounted all of them as viable platforms for the opportunity on our table. We wanted to do this right, with modern, flexible, scalable technology that could be delivered quickly and cheaply to provide the client with real value, allowing them to realise benefit as soon as possible. The client was open to suggestions: they wanted their new solution to reinforce their brand and bring it firmly into the 21st century by being innovative with both technology and delivery methodology.

As a result, we did a lot of research into e-commerce platforms to see if we could find something that fitted our criteria. During that research we came across Spree Commerce.

Enter Spree Commerce


Kicking off the Spree Hackathon

Since finding Spree Commerce, we have been researching the platform deeply (we have just finished a 2-day weekend hackathon building a store from scratch – more about that in future blogs) and we like what we see. The Spree storefront is an open source Ruby on Rails application. Here are some quick at-a-glance facts about Spree:

  • It has nearly 500 active contributors to the project which puts it in the top 50 open source projects in the world (out of approximately 3 million total)
  • It has approximately 50,000 lines of code to date (about 100 per contributor)
  • There are already over 20,000 stores on the platform globally
  • It’s had over 225,000 downloads
  • Any Ruby developer can modify the software to meet their store’s exact needs — no proprietary programming skills needed
  • The storefront supports responsive web design out of the box for a great user experience. It also has a complete feature set across Product Catalogue, Marketing & Promotions, Payments, Shipping, Site Management & SEO, Checkout, International Features (such as multi-currency) and Analytics & Reporting.

Spree Hub

As well as the open source Spree Commerce storefront, Spree provide a managed software-as-a-service offering, the Spree Commerce Hub. This is basically a message broker that integrates seamlessly with the storefront, effectively decoupling your storefront from complex back-end integration and automating all of the logistics between the two. There are lots of out-of-the-box back-end integrations, and the list continues to grow.

The Spree Hub is fully managed as a service by Spree Commerce, so it comes at a cost, albeit a very competitive one. It is also a fixed yearly licence fee, so the cost doesn’t escalate with the scale of your business (as it does with Demandware).

The storefront combined with the Spree Hub is a very compelling option for delivering e-commerce platforms smarter. After this weekend’s hackathon, Red Badger will definitely be recommending it as the way forward to our retail clients.

We are excited about how we can approach building enterprise-scale e-commerce platforms, delivering them at a fraction of the cost of other platforms and incredibly quickly, using methodologies such as Kanban and continuous deployment. We’re looking forward to using Spree to integrate with the likes of Elasticsearch and then contributing our code back to the community. We’ll be using the Spree Hub as a message broker too: our clients tend to be large, with the usual complex back-end systems such as ERP, logistics and fulfilment, and in general the integration of many systems is made much easier through Spree. The API in the storefront will allow us to easily integrate native apps as well as a responsive web front-end, because they will effectively just be consuming the same data.

The proposal is compelling. At Red Badger, we don’t think there is a better E-com solution than Spree Commerce out there at the moment.

Spree Conf

As part of my investigation into Spree Commerce, I went to SpreeConf in New York in late February to meet the community, find out more about the platform and see who is using it and how. (I’ll write a separate blog summarising the conference itself, so will keep this section short.)

At the conference I was blown away by the enthusiasm and energy of the community. Everyone is genuinely excited about what they can do on the platform.

Two key speeches worth mentioning were from Antonio Silveira, VP Engineering at GoDaddy (who have just announced their partnership with Spree), and Andy Dunn, CEO of Bonobos, who are re-platforming to Spree and have also just launched their women’s clothing line, Ayr.com, with a first release in just 96 days.

GoDaddy have a platform hosting 60,000 e-commerce stores. It runs on an aging tech stack that is getting a complete overhaul, and at the core of the new platform will be Spree Commerce. Interestingly, Antonio stated the following as the key factors in GoDaddy choosing Spree Commerce as their platform:

  • Community – their activity on GitHub, how many people are committing, etc.
  • Quality – Spree’s code base and Ruby on Rails being higher quality than the competition
  • Amazing feature set
  • Vision

Andy Dunn shared the vision sentiment. Having based most of his talk on why e-commerce is a bad business, he finished by saying “Spree is here to save the day in transforming e-com”.

Both are very compelling views from people who are already doing it with Spree Commerce.

Summary

As a company, Red Badger are very excited about Spree. We feel we can bring something new to the table for our retail clients, present and future, helping remove the pain and cost typically felt by going with a “default choice”.

The Spree community is incredible. Spree clients are customising Spree to fix the problems they experience (be it an integration with Elasticsearch or a CMS), and with the help of Spree Commerce these changes are finding their way back into the product. Companies that could be seen as competitors are sharing their code and their learnings in true open source fashion (we had a developer from Rails Dog join us for the Spree hackathon at our offices this weekend). All of this results in a great product that is improving rapidly, and everyone benefits.

Watch this space for updates on our hackathon and more blogs about our learnings with Spree Commerce.

We have also started a London Spree User Group at our offices to see if we can generate more interest in the open source community here in London and to share war stories.

Hopefully we’ll be delivering our first projects on the platform soon so we’ll keep you updated on that too. We’re looking forward to getting involved with the community, contributing back and making e-commerce a much nicer space to be playing in.

Ref: Spree Commerce Sites

See below for some examples of nice Spree Commerce sites that are live. I think you’ll agree the user experience and design of these sites are delightful, and all are responsive.

14
Mar
2014

Stop, Collaborate and Listen

by Joe Dollar-Smirnov


 

Collaboration is at the heart of innovation.

Newton’s laws, Einstein’s general theory of relativity, the lightbulb and the kitchen sink are all examples of great things that have come out of some sort of collaboration. The benefits of collaboration are clear, yet the practice itself is often underestimated and difficult to get going. Here are 5 top tips for ensuring collaboration flourishes.

 

1/ Humble pie

Everybody’s opinion is valid. Make people comfortable enough in the environment to speak up when they feel they can add to an idea and collaborate. Great ideas come from all levels of the organisation, from intern to CEO.

 

2/ Peer to peer

Treat everyone with the same level of respect.

 

3/ Brainstorm

Get your ideas out. Vocalise, share, sketch, write and build. Play. Be quick, the more ideas you get out of your brain the sooner you’ll start to see real value and inspiration. Ideas will feed new ideas.

 

4/ Constructive critique

Don’t be rude. The only people with an excuse to get defensive in a critique are those who have not had much practice. If you’re an old hand, be the one to take criticism on the chin and move on. Learn.

 

5/ Dispose of ideas (your own ones!)

If an idea isn’t really working don’t dwell on it. Throw it away. Repeat.

11
Mar
2014

Componentify it

by Alex Savin


Original image by Jonathan Kos-Read, licensed with Creative Commons BY-ND 2.0

One of the most amazing things about web development is the scale of wheel reinvention. Pretty much every developer at some point decides to conquer a common problem by rewriting it from scratch. Reusing existing solutions is often difficult – they are likely to be tied to internal styles, libraries and build processes. This creates a nurturing ground where the same features are implemented many, many times, not only because developers think they can do better, but also because it takes too much effort to reuse an existing solution without bringing in all of its dependencies.

This blog post is about the Component approach. Components are designed to have as few dependencies as possible, rely on common web technologies, and be easily reusable and extendable. With the component approach it’s trivial to unplug parts of an app and replace them with other parts, or with an improved version of the same component. There are certain principles to consider when releasing new components, since Component itself doesn’t really limit you in any way. It is possible that later on we’ll have some monster components with a huge number of dependencies lurking around – but then again, using such components is completely up to you.

When an aspect of a component may be useful to others, consider writing that as a component as well. If it requires reasonable effort to write the code in the first place, chances are someone else could use it too.

Building better components

Make a new component

Component comes with a handy console command. Assuming you have the component npm package installed, try running component help

To create a new component, run component create mynewcomponent. This command will ask you (quite) a few questions and then create a new folder with some basic files.

If you try compiling the newly generated component, you might get the following error:

$ component build
error : ENOENT, open '/Users/alex/red-badger/mytestcomponent/template.js'

This happens because a template.js file is specified in component.json but is not generated by component create. You can either create this file manually or remove it from component.json. After that, component build should generate your first component under the /build folder.

Each component can contain any number of:

  • JS files, with a mandatory index.js or main.js file assigned to the “main” directive in component.json
  • CSS files
  • HTML files
  • Local and external dependencies to other components

All assets must be explicitly listed in component.json, otherwise they are ignored. This is another clever feature of Component, since the folder might contain temp files, npm packages and generated files. When Component is instructed to pick only certain files, it will not only limit the build to those files but also fetch only them from GitHub (ignoring everything else you could’ve pushed there on purpose or by accident). This way, building components with dependencies becomes much faster.

A component doesn’t have to contain any JavaScript logic at all – it can simply export an HTML template or a CSS style:
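For instance, a style-only component might ship nothing but a stylesheet. A minimal component.json for it could look something like this (the name and file are illustrative, not from a real component):

```json
{
  "name": "badge-style",
  "version": "0.0.1",
  "styles": ["badge.css"]
}
```

The build simply concatenates the listed styles into the final CSS bundle; no scripts are needed.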

You can componentify any bit of style, markup or functionality into a reusable package – as long as it makes sense.

Entry point

A fairly common pattern for a UI component is to ask for a DOM selector where the component would insert itself.

You can also use the component/dom component for basic DOM manipulation. The dom component is obviously not a jQuery replacement and lacks a lot of jQuery functionality. You can also live well without the dom component and manipulate the DOM directly with document methods like document.getElementById and document.querySelectorAll. Adding and removing classes on elements can be slightly more challenging without any libraries, but there is a special component for that too – component/classes.

Here is an example of a component main.js using the dom component and appending itself to the target element:
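The original listing isn’t reproduced here, but a minimal sketch could look like this (the component name and markup are assumptions, and the exact dom API should be checked against the dom component’s docs; this runs in the browser after a component build):

```javascript
// main.js – a hypothetical "dropdown" component
var dom = require('dom');

module.exports = function (selector) {
  // Build the component's markup and append it to the target element
  var el = dom('<ul class="dropdown"></ul>');
  dom(selector).append(el);
  return el;
};
```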

In this case the dom component would be specified in the dependencies section of component.json:
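The corresponding component.json fragment would look something like this (the version is left as * for brevity):

```json
{
  "name": "dropdown",
  "version": "0.0.1",
  "main": "main.js",
  "scripts": ["main.js"],
  "dependencies": {
    "component/dom": "*"
  }
}
```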

Using CoffeeScript, LiveScript, LESS and Jade with components

If you are going to release a public component, it makes sense to package it with conventional JS/CSS/HTML files to reduce the number of dependencies and increase reusability. For internal components we’ve mostly used LiveScript, CoffeeScript, LESS styles and Jade templates. And to complete the LiveScript picture, we’ve repackaged the Prelude.ls library into a component and used it as a dependency.

Notable npm packages for component builds:

In your Gruntfile you can configure the builder to use extra components:

And in grunt.initConfig section:
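A sketch of such a Gruntfile, assuming the grunt-component-build plugin (task and option names may differ between plugin versions — treat this as an outline rather than a definitive config):

```javascript
// Gruntfile.js – illustrative only; check the grunt-component-build docs
module.exports = function (grunt) {
  grunt.loadNpmTasks('grunt-component-build');

  grunt.initConfig({
    component_build: {
      app: {
        output: './build/',
        scripts: true,
        styles: true,
        configure: function (builder) {
          // register extra compilers here, e.g. for LiveScript, LESS or Jade
        }
      }
    }
  });

  grunt.registerTask('default', ['component_build']);
};
```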

Global and local components

When using components extensively in your app, you might end up with lots of public and private components. All public components and their dependencies will be automatically installed into a /components folder. It’s a good idea to put your private components into a /local folder next to the /components folder. You can also have dependencies between your local components:
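For example, a local component can list other local components under a local directive (component names here are hypothetical):

```json
{
  "name": "user-profile",
  "version": "0.0.1",
  "main": "index.js",
  "scripts": ["index.js"],
  "local": ["avatar", "dropdown"]
}
```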

The syntax for local dependencies differs slightly from global dependencies – there is no version, and you simply list all local components as an array.

In the root of both the /local and /components folders you will need a main component.json file, which tells the build process which components need to be included in the final main.js file:
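Such a root component.json could look like this (names are hypothetical):

```json
{
  "name": "main",
  "paths": ["local"],
  "local": ["user-profile", "dropdown"]
}
```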

The paths directive tells the builder to look in the /local folder in addition to the /components folder.

Later, in the HTML file, you simply include this generated main.js.

The content of this file is evaluated, and then you can require and start using any of the components on your pages:
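Something along these lines (component names are hypothetical; this runs in the browser, not Node):

```javascript
// after <script src="build/main.js"></script> has been included on the page
var Dropdown = require('dropdown');       // resolve a component by name
var dropdown = new Dropdown('#sidebar');  // use it like any other module
```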

Most Red Badger components include an example HTML file with everything you need to start using that component on your pages.

Check for existing implementation

Before implementing your own component, it might be worth checking for an existing implementation. Most components are listed here:

https://github.com/component/component/wiki/Components

The component.io site is another slick way of browsing and searching for existing components. In many cases you’d want to start with an existing component and either use it as a dependency or fork it and add your own functionality. The standard component/* components are an especially easy starting point for extending.

Conclusion

The component approach requires some extra effort. You have to abstract and package bits of UI, make them fairly independent and think about possible reuse cases. Alternatively, you can skip the component creation stage completely and just use existing components. We chose to componentify most of our recent World Risk Review project, and this proved to be a brilliant decision: instead of repetitive blocks of styles, markup and logic, we managed to put most of the front-end elements into components and reuse them when needed. Some of the components were also released as open source, and we hope to see even more useful components released in the future!

4
Mar
2014

Functional refactoring with LiveScript

by Alex Savin

sibelius_pipes
Original image by Dennis Jarvis. Used under Creative Commons BY-SA license.

LiveScript is a functional language that compiles to JavaScript. You could say it’s sort of like CoffeeScript, but in fact it’s so much better. This post features one hands-on example of refactoring JavaScript-like code using the power of LiveScript syntax, combined with Prelude.ls extras.

Here is a function that processes an array of tags into an object for a D3.js visualisation component. As input it takes an array like ['tag1', 'tag2', 'tag2', 'tag2', 'tag3', ... ]. The function selects the 10 most popular tags and constructs a D3.js-compatible object.
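The original listing isn’t reproduced here, but an imperative JavaScript version of the same logic would look roughly like this (a reconstruction, not the exact original; the function name is an assumption):

```javascript
function topTags(tags) {
  // Count occurrences of each tag
  var counts = {};
  tags.forEach(function (tag) {
    counts[tag] = (counts[tag] || 0) + 1;
  });

  // Turn the counts into D3.js-friendly {name, size} objects
  var pairs = Object.keys(counts).map(function (name) {
    return { name: name, size: counts[name] };
  });

  // Sort by popularity, most used first, and keep the top 10
  pairs.sort(function (a, b) { return b.size - a.size; });
  return pairs.slice(0, 10);
}
```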

However, when I showed this code to Viktor, he was quick to point out that LiveScript could do better. A couple of minutes later he produced this:
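The refactored listing isn’t shown above, but based on the step-by-step walkthrough it would read approximately like this (a reconstruction; the function name is an assumption):

```livescript
top-tags = (tags) ->
  tags
  |> group-by (-> it)
  |> obj-to-pairs
  |> map (-> [it[0], it[1].length])
  |> sort-by (.1)
  |> reverse
  |> take 10
  |> map (-> {name: it[0], size: it[1]})
```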

8 LOC vs. 21 LOC. The beauty of LiveScript is that it is very readable: you can figure out what’s going on just by reading the code. The refactored version also compiles into neater-looking JS.

What’s going on here?

|> is the LiveScript pipe operator: the result of the previous operation is passed on to the next one. We are effectively processing a single input, so it is piping all the way.

group-by (-> it) — using a Prelude.ls function to create an index of the tags array. This will create an object that looks like this: {'tag1': ['tag1'], 'tag2': ['tag2', 'tag2', 'tag2'], ...}. We can see a nice example of LiveScript syntax here, where -> it effectively compiles into:
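That is, a plain identity function (assigned to a name here only so it can be called; the compiler emits just the anonymous function):

```javascript
// what `-> it` compiles to
var identity = function (it) {
  return it;
};
```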

Note that the tags are piped into this function.

obj-to-pairs — another Prelude.ls function, which takes an object and returns a list of pairs. This way our previous object turns into something like this:

[['tag1', ['tag1']], ['tag2', ['tag2', 'tag2', 'tag2']], ... ]

map (-> [it[0], it[1].length]) — maps every entry of the array using the supplied function. This produces a new array:

[['tag1', 1], ['tag2', 3], ...]

Again, we are using the default argument it for each entry of the previous array.

sort-by (.1) is a clever use of LiveScript syntax to access the second entry of a pair like ['tag2', 3] and sort the master array by that value. The sort-by function is again provided by the awesome Prelude.ls. An interesting detail here is that (.1) actually compiles into a function:
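Namely, a function that reads index 1 of its argument (again assigned to a name here only for illustration):

```javascript
// what `(.1)` compiles to
var second = function (it) {
  return it[1];
};
```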

This means that you can do things like sort-by last, array, which will sort an array of arrays by the last item of each inner array (last being a Prelude.ls function again).

reverse — simply reverses the array in order to get top 10 of the most used tags with the next step, which is…

take 10 — takes the first 10 entries from the array. It is smart enough to take fewer entries if the array is not big enough.

And all this leads to the last step:

map (-> {name: it[0], size: it[1]}) — creates the final array of objects with name and size values. The final array will look like this:

[{name: 'tag2', size: 3}, {name: 'tag8', size: 2}, {name: 'tag1', size: 1}, ...]

In LiveScript the last result is automatically returned, so there is no need to explicitly return a value.

LiveScript is a very powerful language with (mostly) human-readable syntax. Combined with the Prelude.ls library, you can write less code that looks elegant and does so much more.

3
Mar
2014

Farmer training UX research

by Joe Dollar-Smirnov

Image of Haller Farmer Training in action


As Cain mentioned in this blog post, we are collaborating with Pearlfisher on an app that we hope will bring some very relevant and useful content to those living in rural Africa.

The charity behind the project, Haller, uses proven methods to train local farmers and villagers at its demonstration farm (shamba) in Mombasa. This gives locals the power to build essential, sustainable sources of food, medicinal plants and sanitation facilities for themselves and their communities. It is this transfer of knowledge that we aim to emulate through the app. Making it available online for those who are unable to be physically present for training is a huge challenge.

Physical and environmental limitations present problems we are not used to working with from our cosy office in east London. To fully understand the tech available to our target users, or the lack thereof, research is underway in parallel with the development of prototypes we can test when we visit next month.

Watch this space and we’ll share our findings on why this demographic group is leading the way in areas of technology that have not (yet?) gained traction in the ‘Western World’, and how we intend to run the research and usability testing on the ground.