One of the great uncertainties of any business endeavour (and especially in marketing) is whether an action will pay off. Quantitatively determining the effectiveness of any piece of advertising, content, or online campaign can be especially tricky. Fortunately, the Internet has given us a goldmine of online testing measures. This A/B testing case shows how we helped two Dutch architecture firms put their best foot forward and grow their online presence.
At Promoguy, we employ meticulous measures to ensure that our clients receive optimal online marketing services. Comparing A/B variants is a large part of this, as it helps us gain insight into targeting needs. However, our process is more involved than simply putting out two contrasting pieces of marketing content and comparing the numbers.
As this online marketing case illustrates, the companies saw growth in their goal metrics soon after we implemented the new measures. This case follows a series of initiatives on social media platforms (LinkedIn and Instagram) to test new types of content, run lead generation campaigns, and boost audience engagement.
As architectural visualisation companies, the Dutch firms produce rendered 3D photo simulations of various residential, urban, or industrial design projects. Some of these visualisations are commissioned by clients or design firms, while others are entries in competitions. These projects help firms and architects present their designs more attractively, portraying them as they would appear after construction.
The clients were both based in the Netherlands, but their services extended to other countries as well. Most of their work was local to their native country and those nearby, e.g. Germany or Belgium. Their work was extensive, with one of the clients having operated for over two decades. They had been making use of a number of our services for quite some time (as highlighted in previous cases).
Needless to say, they wanted to improve their respective online presences, draw in more clients, and boost engagement with social media outreach. Their main aim was brand-building and creating a cogent, audience-tested version of their online marketing content. Therefore, the companies were both looking for lead generation and increased brand awareness as a means of growing their businesses.
More specifically, they wanted to draw in a very particular type of clientele with each campaign. Certain campaigns were geared towards lead generation. To this end, one company was looking to target potentially interested architects or people at architecture firms within the Netherlands, i.e. those who might want to use their services. A second campaign was based on a prepared company mailing list.
Our experts also concluded that to achieve these ends, the companies needed to infuse their content with their own distinct identity and formatting while testing it in different versions. This had to include different texts, images, and pages visible to clients.
A/B Testing Process
Our A/B testing processes vary from project to project. As a baseline, we can create A/B tests for virtually everything a client may want. This includes testing audience locations, different interests, age, gender, and picture or text favourability.
For the uninitiated, A/B testing involves serving two variants of a piece of content to gather user, audience, or other performance metrics. These metrics can be favourability, CTR, traffic generation, or many others, depending on the specifics of the test. For websites, it can be as simple as showing 2 variations of the same page and comparing user responses based on visitors’ actions.
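As a rough illustration (not the tooling used in this case), deciding whether variant B genuinely outperforms variant A on a metric like CTR comes down to a standard two-proportion z-test; the click and view counts below are entirely hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: how strongly does B's rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                               # z-score

# Hypothetical numbers: 120 clicks from 2000 views (A) vs 170 from 2000 (B)
z = two_proportion_z(120, 2000, 170, 2000)
print(round(z, 2))  # |z| > 1.96 → significant at the 5% level
```

A z-score beyond roughly ±1.96 suggests the difference between the variants is unlikely to be chance, which is the bar a winning variant should clear before its elements are carried forward.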
Taking the clients’ specific targeting needs into consideration, it seemed like the right method to use. A/B testing is particularly useful for taking the uncertainty out of website optimisation, which makes it a crucial method in our arsenal. It allows us to gather easily quantifiable data based on user experiences.
Most companies tend to test 2 variants for simple A and B testing. We take it a step further by amalgamating the best elements of both variants. We combine data from social media pixels (LinkedIn, Instagram, and Facebook, depending on the case) with Google Analytics to track more specific user behaviour. By building around the objectives of our research, our analysts further fine-tune content to achieve specific client goals.
These can be metrics like time spent on a page, pixel implementation stats, specific clicking behaviour on landing pages and beyond, and so on. This leads to a more fruitful evaluation of how to move beyond A/B content and create a "content C" that uses the best elements of both variations. Depending on what the client needs, we can pick and choose elements to chase these metrics more effectively.
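To sketch the "content C" idea with made-up numbers: if each content element (image, headline, call to action) has its own engagement score from the A/B run, the synthesised variant simply keeps the stronger option per element. The scores here are hypothetical, and in practice elements can interact, so a content C would still be tested in its own right:

```python
# Hypothetical per-element engagement rates from an A/B run.
variant_a = {"image": 0.031, "headline": 0.048, "call_to_action": 0.022}
variant_b = {"image": 0.044, "headline": 0.039, "call_to_action": 0.025}

# "Content C": for each element, keep whichever variant scored higher.
content_c = {
    element: ("A" if variant_a[element] >= variant_b[element] else "B")
    for element in variant_a
}
print(content_c)  # {'image': 'B', 'headline': 'A', 'call_to_action': 'B'}
```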
The Architecture Firm Online Marketing Analysis
Previously, the firms had been applying content marketing in a slightly haphazard way. The companies had been sharing content (mostly pictures) via social media, but the research behind these operations had been minimal. For instance, previous content went up with little SEO optimisation and was never A/B tested. The infrastructure for this work therefore needed to be built from the ground up.
Our visual designers and copywriting teams began working with several different versions of multiple images and calls to action. This allowed us to narrow the options down to the most effective ones, crafting a potent message to attract potential clients or drive engagement. In one campaign, this came in the form of delivering emails with automated and personalised client messages. For the other campaign, this took the form of social media posts.
The main channels they were using were Instagram and LinkedIn. Both of these proved to be fertile ground for 2 different types of A/B tests: one for informal content that drives audience engagement, and the other for lead generation from potential clients using InMail. These also matched the particular needs of each campaign.
With the variations created, we were ready to test their effectiveness. We found that InMail ads and messages worked particularly well. The ads and personalised messages ran for 2 to 3 days, gathering sufficient data. Then the analysis period began.
Emerging Results & Adjustments
The early results were very promising, but there was more work ahead. Analysing the versions and judging performance, we tinkered with the format of each piece of content to produce more robust marketing materials. As a consequence, engagement climbed sharply after these changes were implemented.
As mentioned previously, one of the major concerns with hiring a marketing firm or implementing any marketing strategy is viability. After A/B testing and analysing the data, we arrived at a refined approach and had the results to show for it. The LinkedIn re-marketing campaign obtained nearly 2 dozen potential leads within a very short time.
Impressions and reach for both campaigns were at an all-time high. Additionally, conversions were climbing while costs per conversion remained stable, showing that the clients were getting their money’s worth and more. At the most optimal rate, we were paying just 0.04 per engagement.
As this case study illustrates, it isn’t enough to test 2 variants; one must evaluate and re-synthesise from user data. Most companies lead with 2 variants, check superficial metrics, and pick one of the two as they stand. At Promoguy, we evaluate and retool options using deeper data streams beyond simple CTRs and engagement. We also maximise cost-effectiveness and emphasise procedural efficiency.