How Our Website Conversion Strategy Increased Business Inquiries by 37%

Having a website that doesn’t convert is a little like having a bucket with a hole in it. Do you keep filling it up while the water’s pouring out — or do you fix the hole then add water? In other words, do you channel your budget into attracting people who are “pouring” through without taking action, or do you fine-tune your website so it’s appealing enough for them to stick around?

Our recommendation? Optimize the conversion rate of your website, before you spend on increasing your traffic to it.

Here's a web design statistic to bear in mind: you have 50 milliseconds to make a good first impression. If your site is too slow, unattractive, or unclear in its wording, visitors will bounce faster than you can say "leaky bucket". Which is a shame, because you've put lots of effort into designing a beautiful product page and About Us page, and people just aren't sticking around to see them.

As a digital web design and conversion agency in Melbourne, Australia, we've been helping our customers optimize their websites for over 10 years, but it wasn't until mid-2019 that we decided to turn the lens on ourselves and take a look at our own site.

As it turned out, we had a bit of a leaky bucket situation of our own: while our traffic was good and conversions were okay, there was definitely room for improvement.

In this article, I'm going to talk a little more about conversions: what they are, why they matter, and how they help your business. I'll then share how we made lots of little tweaks that cumulatively led to our business attracting a higher tier of customers, more inquiries, and over $780,000 worth of new sales opportunities within the first 26 weeks of making some of those changes. Let's get into it!

What is conversion?

Your conversion rate is a figure that represents the percentage of visitors who come to your site and take the desired action, e.g. subscribing to your newsletter, booking a demo, purchasing a product, and so on.
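
As a quick illustration of the arithmetic, here is the calculation in a few lines of Python. The visitor and inquiry numbers are invented purely for the example:

```python
# Conversion rate = (visitors who took the desired action / total visitors) * 100
# Hypothetical numbers for illustration only.
visitors = 4_000   # total site visits in a month
inquiries = 92     # visitors who submitted the contact form

conversion_rate = inquiries / visitors * 100
print(f"Conversion rate: {conversion_rate:.1f}%")  # -> Conversion rate: 2.3%
```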

Conversions come in all shapes and sizes, depending on what your website does. If you sell a product, making a sale would be your primary goal (aka a macro-conversion). If you run, say, a tour company or media outlet, then subscribing or booking a consultation might be your primary goal.

If your visitor isn’t quite ready to make a purchase or book a consultation, they might take an intermediary step — like signing up to your free newsletter, or following you on social media. This is what’s known as a micro-conversion: a little step that leads towards (hopefully) a bigger one.

A quick recap

A conversion can apply to any number of actions — from making a purchase, to following on social media.

Macro-conversions are those we usually associate with sales: a phone call, an email, or a trip to the checkout. These happen when the customer has done their research and is ready to leap in with a purchase. If you picture the classic conversion funnel, they’re already at the bottom.

Conversion funnel showing paying clients at the bottom.

Micro-conversions, on the other hand, are small steps that lead toward a sale. They’re not the ultimate win, but they’re a step in the right direction.

Most sites and apps have multiple conversion goals, each with its own conversion rate.

Micro-conversions vs. macro-conversions: which is better?

The short answer? Both. Ideally, you want micro- and macro-conversions to be happening all the time so you have a continual flow of customers working their way through your sales funnel. If you have neither, then your website is behaving like a leaky bucket.

Here are two common issues that seem like good things, but ultimately lead to problems:

High web traffic (good thing) but no micro- or macro-conversions (bad thing — leaky bucket alert)

High web traffic (good thing) and plenty of micro-conversions (good thing), but no macro-conversions (bad thing)

A lot of businesses spend heaps of money making sure their employees work efficiently, but far less of the budget goes into what is actually one of their best marketing tools: the website.

Spending money on marketing will always be a good thing. Getting customers to your site means more eyes on your business — but when your website doesn’t convert visitors into sales, that’s when you’re wasting your marketing dollars. When it comes to conversion rate statistics, one of the biggest eye-openers I read was this: the average user’s attention span has dropped from 12 to a mere 7 seconds. That’s how long you’ve got to impress before they bail — so you’d better make sure your website is fast, clear, and attractive.

Our problem

Our phone wasn't ringing as much as we'd have liked, despite spending plenty of dollars on SEO and Google Ads. We looked into our analytics and realized traffic wasn't an issue: a decent number of people were visiting our site, but too few were taking action — i.e. inquiring. Here's where some of our issues lay:

Our site wasn't as fast as it could have been (anything with a load time of two seconds or over is considered slow; ours was hovering around five to six seconds, and that was having a negative impact on conversions).

Our CTA conversions were low (people weren’t clicking — or they were dropping off because the CTA wasn’t where it needed to be).

We were relying on guesswork for some of our design decisions — which meant we had no way of measuring what worked, and what didn’t.

In general, things were good but not great. Or in other words, there was room for improvement.

What we did to fix it

Improving your site's conversions isn't a one-size-fits-all thing — which means what works for one person might not work for you. It's a gradual journey of trying different things out and building up successes over time. We knew this from having worked on hundreds of client websites over the years, so we went into our own redesign with this in mind. Here are some of the steps we took that had an impact.

We decided to improve our site

First of all, we decided to fix our company website. This sounds like an obvious one, but how many times have you thought "I'll do this really important thing", then never gotten round to it? Or rushed ahead in excitement, made a few tweaks yourself, then let your efforts grind to a halt because other things took precedence?

This is an all-too-common problem when you run a business and things are just… okay. Often there’s no real drive to fix things and we fall back into doing what seems more pressing: selling, talking to customers, and running the business.

Improving your site's conversions starts with a decision that involves you and everyone else in the company, and that's what we did. We got the design and analytics experts involved. We invested time and money into the project, which made it feel substantial. We even made EDMs (electronic direct mail campaigns) to announce the site launch (like the one below) to let everyone know what we'd been up to. In short, we made it feel like an event.

Graphic showing hummingbird flying in front of desktop monitor with text

We got to know our users

There are many different types of user: some are ready to buy, while others are just window shopping. Knowing what type of person visits your site will help you create something that caters to their needs.

We looked at our analytics data and discovered visitors to our site were a bit of both, but tended to be more ready to buy than not. This meant we needed to focus on getting macro-conversions — in other words, make our site geared towards sales — while not overlooking the visitors doing some initial research. For those users, we implemented a blog as a way to improve our SEO, educate leads, and build up our reputation.

User insight can also help you shape the feel of your site. We discovered that the marketing managers we were targeting at the time were predominantly women, and that certain images and colours resonated better with that specific demographic. We didn't go for the obvious (pictures of the team or our offices), instead relying on data and the psychology of attraction to get into the minds of our users.

Chromatix website home page showing a bright pink flower and text.
Chromatix web page showing orange hummingbird and an orange flower.

We improved site speed

Even a good site erodes trust and sends visitors running if it loads slowly. Multiple studies show that site speed matters when it comes to conversion rates. It's one of the top SEO ranking factors, and a big factor when it comes to user experience: pages that load in under a second convert at around 2.5 times the rate of pages taking five seconds or more.

Bar chart showing correlation between fast loading pages and a higher conversion rate.

We built our website for speed. Moz has a great guide on page speed best practices, and from that list, we did the following things:

We optimized images.

We managed our own caching.

We compressed our files.

We improved page load times (Moz has another great article about how to speed up Time to First Byte). A good web page load time is considered to be anything under two seconds — which we achieved.

We also customized our own hosting to make our site faster.
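
If you want a rough way to sanity-check load time and Time to First Byte yourself, a few lines of Python will do it. This is only a sketch: the URL is a placeholder, and the requests library measures server response rather than a full in-browser render, so treat the numbers as a lower bound.

```python
import time
import requests

URL = "https://www.example.com/"  # placeholder: swap in the page you want to test

# With stream=True, requests returns as soon as the response headers arrive,
# so response.elapsed is a reasonable proxy for Time to First Byte.
response = requests.get(URL, stream=True, timeout=30)
ttfb = response.elapsed.total_seconds()

# Downloading the body gives a rough total transfer time (this ignores
# JavaScript, CSS and rendering, so real in-browser load times will be higher).
start = time.monotonic()
_ = response.content
total = ttfb + (time.monotonic() - start)

print(f"Time to First Byte: {ttfb:.2f}s")
print(f"Approx. transfer:   {total:.2f}s  (aim for a full page load under ~2s)")
```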

We introduced more tracking

As well as making our site faster, we introduced a lot more tracking. That allowed us to refine our content, our messaging, the structure of the site, and so on, which continually improved our conversions.

We used Google Optimize to run A/B tests across a variety of things to understand how people interacted with our site. Here are some of the tweaks we made that had a positive impact:

Social proof can be a really effective tool if used correctly, so we added some stats to our landing page copy.

Google Analytics showed us visitors were reaching certain pages without quite knowing where to go next, so we added CTAs that used active language. Instead of saying, "If you'd like to find out more, let us know", we said "Get a quote", along with two options for getting in touch.

We spent an entire month testing four words on our homepage. The test failed (the words didn't have a positive impact), but it allowed us to test our hypothesis. We made small tweaks and tests like this all over the site.

Analytics data showing conversion rates.

We used heat mapping to see where visitors were clicking, and which words caught their eye. With this data, we knew where to place buttons and key messaging.
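
When we talk about a test "failing" or "working", the underlying question is whether the difference in conversion rate is bigger than random noise. Below is a minimal sketch of that check using a two-proportion z-test; the visitor and conversion counts are invented for the example.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results from an A/B test of homepage wording.
visitors_a, conversions_a = 5_200, 118   # control
visitors_b, conversions_b = 5_150, 149   # variation

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Pooled two-proportion z-test.
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"Control: {p_a:.2%}, Variation: {p_b:.2%}")
print(f"z = {z:.2f}, p-value = {p_value:.3f}")
print("Likely a real lift" if p_value < 0.05 else "Could just be noise")
```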

We looked into user behavior

Understanding your visitor is always a good place to start, and there are two ways to go about this:

Quantitative research (numbers and data-based research)

Qualitative research (people-based research)

We did a mixture of both.

For the quantitative research, we used Google Analytics, Google Optimize, and Hotjar to get an in-depth, numbers-based look at how people were interacting with our site.

Heat-mapping software shows how people click and scroll through a page. Hot spots indicate places where people naturally gravitate.

We could see where people were coming into our site (which pages they landed on first), what channel brought them there, which features they were engaging with, how long they spent on each page, and where they abandoned the site.
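
As an illustration of the kind of number-crunching involved, here is a minimal sketch that flags high-traffic, low-converting landing pages from an exported analytics report. The CSV filename and column names are assumptions for the example; every analytics tool exports slightly different fields.

```python
import pandas as pd

# Assumed export format: one row per landing page with columns
# landing_page, sessions, bounces, conversions.
df = pd.read_csv("landing_pages.csv")

df["bounce_rate"] = df["bounces"] / df["sessions"]
df["conversion_rate"] = df["conversions"] / df["sessions"]

# Pages with plenty of traffic but poor conversion are the leaky parts of the bucket.
leaky = (
    df[df["sessions"] >= 200]
    .sort_values(["conversion_rate", "bounce_rate"], ascending=[True, False])
    .head(10)
)
print(leaky[["landing_page", "sessions", "bounce_rate", "conversion_rate"]])
```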

For the qualitative research, we focused primarily on interviews.

We asked customers what they thought about certain CTAs (whether they worked or not, and why).

We made messaging changes and asked customers and suppliers whether they made sense.

We invited a psychologist into the office and asked them what they thought about our design.

What we learned

We found out our design was good, but our CTAs weren’t quite hitting the mark. For example, one CTA only gave the reader the option to call. But, as one of our interviewees pointed out, not everyone likes using the phone — so we added an email address.

We were intentional, but fairly ad hoc, about how we asked these questions. This worked for us — but you might want to be a bit more formal about your approach (Moz has a great practical guide to conducting qualitative usability testing if you're after a more in-depth look).

The results

Combined, these minor tweaks had a mighty impact. There’s a big difference in how our site looks and how we rank. The bottom line: after the rebuild, we got more work, and the business did much better. Here are some of the gains we’ve seen over the past two years.

Pingdom website speed test for Chromatix.

Our dwell time increased by 73%, going from 1.5 to 2.5 minutes.

We received four times more inquiries by email and phone.

Our organic traffic increased despite us not channeling more funds into PPC ads.

Graph showing an increase in organic traffic from January 2016 to January 2020.
Graph showing changes in PPC ad spend over time.

We also realized our clients were bigger, paying on average 2.5 times more for jobs: in mid-2018, our average cost-per-job was $8,000. Now, it’s $17,000.

Our client list grew to include more recognizable, household-name brands, including two of Australia's top universities and a well-known manufacturing/production brand.

Within the first 26 weeks, we got over $770,000 worth of sales opportunities (if we’d accepted every job that came our way).

Our prospects began asking to work with us, rather than us having to persuade them to give us the business.

We started getting higher quality inquiries — warmer leads who had more intent to buy.

Some practical changes you can make to improve your website conversions

When it comes to website changes, it’s important to remember that what works for one person might not work for you.

We've used site speed boosters for our clients before and gotten really great results. At other times, we've tried them and they just broke the website. This is why it's so important to measure as you go, use what works for your individual needs, and remember that "failures" are just as helpful as wins.

Below are some tips — some of which we did on our own site, others are things we’ve done for others.

Tip number 1: Get stronger hosting that allows for things like CDNs. Hiring a developer should always be your top choice, but that isn't always a luxury you can afford. In that case, we recommend considering a CDN and, depending on the build of your site, paying for tools like NitroPack, which can help with caching and compression for faster site speeds.

Tip number 2: Focus your time. Identify top landing pages with Moz Pro and channel your efforts in these places as a priority. Use the 80/20 principle and put your attention on the 20% that gets you 80% of your success.

Tip number 3: Run A/B tests using Google Optimize to test various hypotheses and ideas (Moz has a really handy guide for running split tests using Google). Don't be afraid of the results: failures can help confirm that what you are currently doing is right. You can also access some in-depth data about your site's performance in Google Lighthouse.

Site performance data in Google Lighthouse.

Tip number 4: Trial various messages in Google Ads (as a way of testing targeted messaging). Google provides many keyword suggestions on trending words and phrases that are worth considering.

Tip number 5: Combine qualitative and quantitative research to get to know how your users interact with your site — and keep testing on an ongoing basis.

Tip number 6: Don’t get too hung up on charts going up, or figures turning orange: do what works for you. If adding a video to your homepage slows it down a little but has an overall positive effect on your conversion, then it’s worth the tradeoff.

Tip number 7: Prioritize the needs of your target customers and focus every build and design choice around them.

Recommended tools

NitroPack: speed up your site if you've not built it for speed from the beginning.

Google Optimize: run A/B tests

HotJar: see how people use your site via heat mapping and behaviour analytics.

Pingdom / GTmetrix: measure site speed (using both is better if you want to make sure you meet everyone's requirements).

Google Analytics: find drop-off points, track conversion, A/B test, set goals.

Qualaroo: poll your visitors while they are on your site with a popup window.

Google Consumer Surveys: create a survey, Google recruits the participants and provides results and analysis.

Moz Pro: Identify top landing pages when you connect this tool to your Google Analytics profile to create custom reports.

How to keep your conversion rates high

Treat your website like your car: regular little tweaks to keep it purring, and occasional deeper inspections to make sure there are no problems lurking just out of sight. Here's what we do:

We look at Google Analytics monthly. It helps to understand what’s working, and what’s not.

We use goal tracking in GA to keep things moving in the right direction.

We use Pingdom’s free service to monitor the availability and response time of our site.

We regularly ask people what they think about the site and its messaging (keeping the qualitative research coming in).

Conclusion

Spending money on marketing is a good thing, but when you don’t have a good conversion rate, that’s when your website’s behaving like a leaky bucket. Your website is one of your strongest sales tools, so it really does pay to make sure it’s working at peak performance.

I've shared a few of my favorite tools and techniques, but above all, my one bit of advice is to consider your own requirements. You could improve your site speed by removing all tags and keeping the site plain, but that's not what you want. The goal is finding the balance between creativity and performance, and that will always depend on what's important to your business.

For us as a design agency, we need a site that’s beautiful and creative. Yes, having a moving background on our homepage slows it down a little bit, but it improves our conversions overall.

The bottom line: Consider your unique users, and make sure your website is in line with the goals of whoever you’re speaking with.

We can do all we want to please Google, but when it comes to sales and leads, what matters more is having a higher-converting, more effective website. We did well on inquiries (actual phone calls and email leads) despite a rapid increase in site performance requirements from Google, and that comes down to one thing: having an effective customer conversion framework for your site.

How to leverage the 80/20 rule for martech efficiency and ROI

CMOs and their marketing organizations face a Sisyphean task in managing their ever-expanding martech stack. As these ecosystems grow bloated and more complex, challenges related to integration, utilization and ROI compound, becoming significant blockers to productivity.

In the first part of this two-part guide, we’ll explore these challenges and introduce a powerful solution for your consideration: the Pareto principle, also known as the 80/20 rule. Understanding this principle and how it applies to martech management will better equip you to identify the tools and strategies driving your marketing success.

The insights below will help you evaluate your current martech stack and identify the 20% of tools that generate 80% of your results, setting the stage for optimizing your martech operations.

Navigating the challenges of the martech maze

As a CMO wrestling with your martech stack, you’re likely encountering several key issues:

Data integration and insight generation: Many marketing organizations struggle to create a cohesive ecosystem where martech tools effectively share data. This lack of integration hampers the development of a unified customer view, making it challenging to gather and extract actionable insights.

Underutilization of existing tools: Marketers use only 33% of their martech stack’s capabilities, down from 58% in 2020, according to Gartner’s 2023 Marketing Technology Survey. This underutilization often arises from inadequate training and poor purchasing decisions.

ROI and cost concerns: Justifying the significant investment in martech is challenging, especially when existing tools are underutilized. Poor adoption and lack of use lead to poor ROI, causing hesitation around further investment or adoption.

Adapting to emerging technologies: Keeping pace with rapidly evolving technologies like AI and predictive analytics is an ongoing challenge. CMOs need to balance staying current with innovations while maintaining efficiency with existing systems.

Skills gap in marketing teams: Many CMOs express difficulty finding and training staff who can effectively implement and manage complex martech solutions. This creates a noticeable skills gap within marketing organizations.

Addressing these challenges requires a strategic approach, improving integration, enhancing team capabilities, managing costs and staying current with technological advancements. This is where the Pareto principle can become a powerful strategic tool for you and your team.


Understanding the Pareto principle

Vilfredo Pareto, born in Paris in 1848, was an Italian engineer, economist and sociologist. Pareto began his career as a civil engineer before moving into economics and sociology.

His observations of wealth distribution in Italy led to the 80/20 rule, which states that 80% of effects come from 20% of causes. The principle, closely related to the statistical Pareto distribution, has been widely recognized and applied beyond economics to various fields, including business management, quality control and software engineering.


The principle can be a game changer when it comes to marketing technology stacks. By focusing on the 20% of tools, strategies or efforts that generate 80% of the results, you can optimize your stack and your resource allocation. This targeted approach enhances efficiency, reduces costs and improves overall outcomes, demonstrating the principle's transformative potential in strategic decision-making.

Applying the Pareto principle to martech management

To apply the 80/20 rule to your martech stack, start with a comprehensive audit of your existing martech tools and their impact:

Inventory assessment: Catalog all tools and platforms currently in use.

Usage analysis: Determine how frequently and by whom each tool is used. Pull user logs for each tool. If it’s not used weekly, flag it.

Performance metrics: Evaluate each tool’s effectiveness based on key performance indicators (KPIs) such as lead generation, conversion rates and customer engagement. (Your mileage may vary, so make sure you use KPIs that are right for your business.)

Cost-benefit analysis: Assess each tool's cost relative to its contribution to overall marketing goals. Calculate each tool's ROI: (Revenue Generated – Total Cost) / Total Cost. Then, rank them (a rough sketch of this step follows the list below).

User feedback: Survey your team. Ask questions like: “If you could only keep 5 tools, which would they be?”

Integration assessment: Map how your tools connect. Isolated tools are red flags.
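
As a rough illustration of the cost-benefit step above, the ranking boils down to a few lines of arithmetic. The tool names and figures below are invented; in practice, attributing revenue to a single tool is the hard part.

```python
# Minimal sketch of the cost-benefit step: compute ROI per tool and rank.
tools = [
    {"name": "CRM",              "revenue": 420_000, "cost": 60_000},
    {"name": "Email platform",   "revenue": 180_000, "cost": 24_000},
    {"name": "Social listening", "revenue": 15_000,  "cost": 30_000},
]

for tool in tools:
    # ROI = (Revenue Generated - Total Cost) / Total Cost
    tool["roi"] = (tool["revenue"] - tool["cost"]) / tool["cost"]

for rank, tool in enumerate(sorted(tools, key=lambda t: t["roi"], reverse=True), 1):
    print(f"{rank}. {tool['name']}: ROI = {tool['roi']:.1f}x")
```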

After the audit, pinpoint the 20% of tools driving 80% of the value. Look for tools that:

Drive multiple KPIs: For example, a CRM that improves lead quality, shortens sales cycles and boosts customer retention.

Have high user adoption: Tools consistently used by >80% of their intended users.

Provide unique, critical data: This tool might offer exclusive audience insights you can’t get elsewhere.

Scale effortlessly: Tools that maintain or improve ROI as your company grows.

Align closely with strategic objectives: Ensure the tools support broader business goals.

Integrate seamlessly: Tools that work well with your stack’s other high-value tools/platforms.

By identifying and focusing on these high-impact tools, you’ll be well-positioned to optimize your martech stack using the 80/20 rule. In Part 2, we’ll explore specific strategies for implementing this approach and maximizing the efficiency of your marketing technology investments.




The 6 Best AI Content Checkers To Use In 2024

Today, many people see generative AI like ChatGPT, Gemini, and others as indispensable tools that streamline their day-to-day workflows and enhance their productivity.

However, with the proliferation of AI assistants comes an uptick in AI-generated content. AI content detectors can help you prioritize content quality and originality.

These tools can help you discern whether a piece of content was written by a human or AI – a task that’s becoming increasingly difficult – and this can help detect plagiarism, and ensure content is original, unique, and high-quality.

In this article, we’ll look at some of the top AI content checkers available in 2024. Let’s dive in.

The 6 Best AI Content Checkers
1. GPTZero

Launched in 2022, GPTZero was “the first public open AI detector,” according to its website – and it’s a leading choice among the tools out there today.

GPTZero’s advanced detection model comprises seven different components, including an internet text search to identify whether the content already exists in internet archives, a burstiness analysis to see whether the style and tone reflect that of human writing, end-to-end deep learning, and more.

Its Deep Scan feature gives you a detailed report highlighting sentences likely created by AI and tells you why that is, and GPTZero also offers a user-friendly Detection Dashboard as a source of truth for all your reports.

The tool is straightforward, and the company works with partners and researchers from institutions like Princeton, Penn State, and OpenAI to provide top-tier research and benchmarking.

Cost:

The Basic plan is available for free. It includes up to 10,000 words per month.
The Essential plan starts at $10 per month, with up to 150,000 words, plagiarism detection, and advanced writing feedback.
The Premium plan starts at $16 per month and includes up to 300,000 words, everything in the Essential tier, as well as Deep Scan, AI detection in multiple languages, and downloadable reports.

2. Originality.ai

Originality.ai is designed to detect AI-generated content across various language models, including ChatGPT, GPT-4o, Gemini Pro, Claude 3, Llama 3, and others. It bills itself as the “most accurate AI detector,” and targets publishers, agencies, and writers – but not students.

The latter is relevant because, the company says, by leaving academia, research, and other historical text out of its scope, it's able to better train its model to home in on published content across the internet, print, etc.

Originality.ai works across multiple languages and offers a free Chrome extension and API integration. It also has a team that works around the clock, testing out new strategies to create AI content that tools can’t detect. Once it finds one, it trains the tool to sniff it out.

The tool is straightforward; users can just paste content directly into Originality.ai, or upload from a file or even a URL. It will then give you a report that flags AI-detected portions as well as the overall originality of the text. You get three free scans initially, with a 300-word limit.

Cost:

Pro membership starts at $12.45 per month and includes 2,000 credits, AI scans, shareable reports, plagiarism and readability scans, and more.
Enterprise membership starts at $179 per month and includes 15,000 credits per month, features in the Pro plan, as well as priority support, API, and a 365-day history of your scans.
Originality.ai also offers a “pay as you go” tier, which consists of a $30 one-time payment to access 3,000 credits and some of the more limited features listed above.

3. Copyleaks

While you’ve probably heard of Copyleaks as a plagiarism detection tool, what you might not know is that it also offers a comprehensive AI-checking solution.

The tool covers 30 languages and detects across AI models including ChatGPT, Gemini, and Claude – and it automatically updates when new language models are released.

According to Copyleaks, its AI detector “has over 99% overall accuracy and a 0.2% false positive rate, the lowest of any platform.”

It works by using its long history of data and learning to spot the pattern of human-generated writing – and thus, flag anything that doesn’t fit common patterns as potentially AI-generated.

Other notable features of Copyleaks’ AI content detector are the ability to detect AI-generated source code, spot content that might have been paraphrased by AI, as well as browser extension and API offerings.

Cost:

Users with a Copyleaks account can access a limited number of free scans daily.
Paid plans start at $7.99 per month for the AI Detector tool, including up to 1,200 credits, scanning in over 30 languages, two users, and API access.
You can also get access to an AI + Plagiarism Detection tier starting at $13.99 per month.

4. Winston AI

Another popular AI content detection tool, Winston AI calls itself “the most trusted AI detector,” and claims to be the only such tool with a 99.98% accuracy rate.

Winston AI is designed for users across the education, SEO, and writing industries, and it’s able to identify content generated by LLMs such as ChatGPT, GPT-4, Google Gemini, Claude, and more.

Using Winston AI is easy; paste or upload your documents into the tool, and it will scan the text (including text from scanned pictures or handwriting) and provide a printable report with your results.

Like other tools in this list, Winston AI offers multilingual support, high-grade security, and can also spot content that’s been paraphrased using tools like Quillbot.

One unique feature of Winston AI is its “AI Prediction Map,” a color-coded visualization that highlights which parts of your content sound inauthentic and may be flagged by AI detectors.

Cost

Free 7-day trial includes 2,000 credits, AI content checking, AI image and deepfake detection, and more.
Paid plans start at $12 per month for 80,000 credits, with additional advanced features based on your membership tier.

5. TraceGPT

Looking for an extremely accurate AI content detector? Try TraceGPT by PlagiarismCheck.org.

It’s a user-friendly tool that allows you to upload files across a range of formats, including doc, docx, txt, odt, rtf, and pdf. Then, it leverages creativity/predictability ratios and other methods to scan your content for “AI-related breadcrumbs.”

Once it’s done, TraceGPT will provide results that show you what it has flagged as potential AI-generated text, tagging it as “likely” or “highly likely.”

As with many of the options here, TraceGPT offers support in several languages, as well as API and browser extension access. The tool claims to be beneficial for people in academia, SEO, and recruitment.

Cost

You can sign up to use TraceGPT and will be given limited free access.
Paid plans differ based on the type of membership; for businesses, they start at $69 for 1,000 pages, and for individuals, it starts at $5.99 for 20 pages. Paid plans also give you access to 24/7 support and a grammar checker.

6. Hive Moderation

Hive Moderation, a company that specializes in content moderation, offers an AI content detector with a unique differentiator. Unlike most of the other examples listed here, it is capable of checking for AI content across several media formats, including text, audio, and image.

Users can simply input their desired media, and Hive’s models will discern whether they believe them to be AI-generated. You’ll get immediate results with a holistic score and more detailed information, such as whether Hive thinks your image was created by Midjourney, DALL-E, or ChatGPT, for example.

Hive Moderation offers a Chrome extension for its AI detector, as well as several levels of customization so that customers can tweak their usage to fit their needs and industry.

Pricing:

You can download the Hive AI Chrome Extension for free, and its browser tool offers at least some free scans.
You’ll need to contact the Hive Moderation team for more extensive use of its tools.

What Is An AI Content Checker?

An AI content checker is a tool for detecting whether a piece of content or writing was generated by artificial intelligence.

Using machine learning algorithms and natural language processing, these tools can identify specific patterns and characteristics common in AI-generated content.
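
As a toy example of one such pattern, "burstiness" is often described as variation in sentence length: human writing tends to mix short and long sentences, while AI output can be more uniform. The sketch below is a crude illustration of that idea only; it is not how any of the commercial tools above actually work.

```python
import re
from statistics import mean, pstdev

def burstiness(text: str) -> float:
    """Crude proxy for burstiness: variation in sentence length, measured in words."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    if len(lengths) < 2:
        return 0.0
    return pstdev(lengths) / mean(lengths)  # coefficient of variation

sample = (
    "The launch went well. Sales doubled within a week, which surprised "
    "everyone on the team given how little we had spent on promotion. "
    "Next quarter we will test a bigger push."
)
print(f"Burstiness score: {burstiness(sample):.2f}")  # higher = more varied sentences
```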

An important disclaimer: At this point in time, no AI content detector is perfect. While some are better than others, they all have limitations.

They can make mistakes, from falsely identifying human-written content as AI-generated to failing to spot AI-generated content.

However, they are useful tools for pressure-testing content to spot glaring errors and ensure that it is authentic and not a reproduction or plagiarism.

Why Use An AI Content Detector?

As AI systems become more widespread and sophisticated, it’ll only become harder to tell when AI has produced content – so tools like these could become more important.

Other reasons AI content checkers are beneficial include:

They can help you protect your reputation. Say you’re publishing content on a website or blog. You want to make sure your audience can trust that what they’re reading is authentic and original. AI content checkers can help you ensure just that.
They can ensure you avoid any plagiarism. Yes, generative AI is only getting better, but it’s still known to reproduce other people’s work without citation in the answers it generates. So, by using an AI content detector, you can steer clear of plagiarism and the many risks associated with it.
They can confirm that the content you’re working with is original. Producing unique content isn’t just an SEO best practice – it’s essential to maintaining integrity, whether you’re a business, a content creator, or an academic professional. AI content detectors can help here by weeding out anything that doesn’t meet that standard.

AI content detectors have various use cases, including at the draft stage, during editing, or during the final review of content. They can also be used for ongoing content audits.

AI detectors may produce false positives, so you should scrutinize their results if you’re using them to make a decision. However, false positives can also help identify human-written content that requires a little more work to stand out.

We recommend you use a variety of different tools, cross-check your results, and build trust with your writers. Always remember that these are not a replacement for human editing, fact-checking, or review.

They are merely there as a helping hand and an additional level of scrutiny.

In Summary

While we still have a long way to go before AI detection tools are perfect, they’re useful tools that can help you ensure your content is authentic and of the highest quality.

By making use of AI content checkers, you can maintain trust with your audience and ensure you stay one step ahead of the competition.

Hopefully, this list of the best solutions available today can help you get started. Choose the tool that best fits your resources and requirements, and start integrating AI detection into your content workflow today.


AI-powered martech news and releases: August 8

Does your consumer product have AI? Do yourself a favor and don’t tell anyone. 

A study of 1,000 people found products described as having AI were consistently less popular than those that weren’t.

“When AI is mentioned, it tends to lower emotional trust, which in turn decreases purchase intentions,” said lead author and Washington State University clinical assistant professor of marketing Mesut Cicek in a statement. “We found emotional trust plays a critical role in how consumers perceive AI-powered products.”

For example, researchers found people were far less likely to purchase a smart television when its description included “artificial intelligence.” A separate group was far more likely to buy it when the words were omitted from an identical description.

The effect was even greater with high-risk/big-ticket purchases like expensive electronics or medical devices. Researchers said this could be because consumers are more wary of monetary loss or danger to physical safety.

“Marketers should carefully consider how they present AI in their product descriptions or develop strategies to increase emotional trust,” said Cicek. “Emphasizing AI may not always be beneficial, particularly for high-risk products. Focus on describing the features or benefits and avoid the AI buzzwords.”

The study was published in the Journal of Hospitality Marketing & Management.

And now, here’s this week’s AI-powered martech news and releases:

Optimove has partnered with Captain Up and Gamanza Engage to create AI-driven gamification solutions. This allows for the integration of gamification features into Optimove’s marketing platform, enhancing customer engagement through real-time event synchronization and personalized interactions. It also supports comprehensive campaign measurement, enabling operators to evaluate the effectiveness of their gamification strategies.

OLIVER’s generative AI tool Slipstream lets users create complete creative briefs to get better results from their agency partner. It checks the initial brief — whether a formal document or a hurried email – for all the key components, such as budgets, timings, target audiences and objectives. The user can then add any missing items. Once that is done, Slipstream puts the brief into a template format and checks to see if it reflects the client’s priorities and business objectives — anything from the need to focus on sustainability or DEI to the brand’s tone of voice and distinctive brand assets.

Webgility’s Webgility AI Assistant is designed to assist small and medium-sized businesses (SMBs) automate their e-commerce workflows. This AI-powered chat assistant utilizes natural language processing to provide quick answers and perform tasks like posting orders to QuickBooks. The feature aims to streamline access to business analytics, helping users manage orders, refunds, and product information efficiently.

ShareThis’ new contextual targeting solution uses AI to analyze web page content for ad placements. It uses large language models and consumer behavior insights to enhance URL classification and contextual targeting without relying on cookies. This allows advertisers to improve the relevance and effectiveness of their ad placements.

Adobe’s Adobe Journey Optimizer (AJO) B2B Edition uses generative AI to create targeted buying groups and personalized customer journeys across various channels. It also assists in creating tailored content and enhances collaboration between sales and marketing teams.

Coveo and Optimizely are working to enhance AI-powered search capabilities across digital platforms. This collaboration focuses on providing personalized user experiences through Coveo’s generative answering solution and advanced search functionalities. The integration aims to improve content discoverability and streamline access to information, allowing businesses to deliver tailored experiences efficiently.

Dstillery integrated its audience solutions with the AI-powered curation platform Onetag. This allows brands and agencies using the Onetag Smart Curation platform to access Dstillery’s pre-built models through private marketplaces. These models utilize ID-free technology, which predicts the value of an ad impression without identifying the user, offering a privacy-safe solution for advertisers.

Consumr.ai’s Audience in Motion helps brands predict audience movements using deterministic data. This data-driven approach provides real-time intelligence, aiding strategic planning while complying with GDPR and CCPA. The platform offers quick implementation and delivers detailed insights into consumer behavior, enabling brands to make informed decisions.

Marketeam.ai’s Maya is an AI Brand Analyst designed to analyze real-time data on brand and market trends. It generates marketing opportunities and plans while working with other AI agents to enhance performance across various channels, providing a comprehensive solution for marketing analysis and strategy.

PMG’s marketing platform Alli added two AI features to improve campaign planning and creative insights. The Audience Planner allows precise audience targeting by integrating first-party and walled garden data, while Creative Insights analyzes ad performance across channels. 

Wondercraft has unveiled the world’s first AI “Ad Studio,” which simplifies audio ad production for creative teams. This platform allows users to produce and iterate studio-quality audio ads quickly by typing, streamlining the creative process for agencies and brands. 


Cannibalization

In today’s episode of Whiteboard Friday, Tom Capper walks you through a problem many SEOs have faced: cannibalization. What is it, how do you identify it, and how can you fix it? Watch to find out! 

Photo of the whiteboard describing cannibalization.

Video Transcription

Happy Friday, Moz fans, and today we’re going to be talking about cannibalization, which here in the UK we spell like this: cannibalisation. With that out of the way, what do we mean by cannibalization?

What is cannibalization?

So this is basically where one site has two competing URLs and performs, we suspect, less well because of it. So maybe we think the site is splitting its equity between its two different URLs, or maybe Google is getting confused about which one to show. Or maybe Google considers it a duplicate content problem or something like that. One way or another, the site does less well as a result of having two URLs. 

So I’ve got this imaginary SERP here as an example. So imagine that Moz is trying to rank for the keyword “burgers.” Just imagine that Moz has decided to take a wild tangent in its business model and we’re going to try and rank for “burgers” now.

So in position one here, we've got Inferior Bergz, and we would hope to outrank these people really, but for some reason we're not. Then in position two, we've got Moz's Buy Burgers page on the moz.com/shop subdirectory, which obviously doesn't exist, but this is a hypothetical. This is a commercial landing page where you can go and purchase a burger.

Then in position three, we’ve got this Best Burgers page on the Moz blog. It’s more informational. It’s telling you what are the attributes to a good burger, how can you identify a good burger, where should you go to acquire a good burger, all this kind of more neutral editorial information.

So we hypothesize in this situation that maybe if Moz only had one page going for this keyword, maybe it could actually supplant the top spot. If we think that’s the case, then we would probably talk about this as cannibalization.

However, the alternative hypothesis is, well, actually there could be two intents here. It might be that Google wishes to show a commercial page and an informational page on this SERP, and it so happens that the second best commercial page is Moz’s and the best informational page is also Moz’s. We’ve heard Google talk in recent years or representatives of Google talk in recent years about having positions on search results that are sort of reserved for certain kinds of results, that might be reserved for an informational result or something like that. So this doesn’t necessarily mean there’s cannibalization. So we’re going to talk a little bit later on about how we might sort of disambiguate a situation like this.

Classic cannibalization

First, though, let’s talk about the classic case. So the classic, really clear-cut, really obvious case of cannibalization is where you see a graph like this one. 

Hand drawn graph showing ranking consequences of cannibalization.

So this is the kind of graph you would see in a lot of rank tracking software. You can see time and the days of the week going along the bottom axis. Then we've got rank, and we obviously want to be as high as possible and close to position one.

Then we see the two URLs, which are color-coded, and are green and red here. When one of them ranks, the other just falls away to oblivion, isn't even in the top 100. There's only ever one appearing at the same time, and they sort of supplant each other in the SERP. When we see this kind of behavior, we can be pretty confident that what we're seeing is some kind of cannibalization.
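
If you track rankings at scale, you can flag this swapping pattern programmatically. The sketch below assumes a hypothetical rank-tracker export as a CSV with date, keyword, url and rank columns; adjust the column names to whatever your tool actually exports.

```python
import pandas as pd

# Hypothetical rank-tracker export: one row per keyword/URL/day,
# with columns date, keyword, url, rank (names assumed for this sketch).
df = pd.read_csv("rank_tracking.csv", parse_dates=["date"])

for keyword, group in df.groupby("keyword"):
    # The best-ranking URL for this keyword on each day.
    best = (
        group.sort_values(["date", "rank"])
        .groupby("date")["url"]
        .first()
        .tolist()
    )
    distinct_urls = set(best)
    # Count how often the top URL changes from one day to the next.
    flips = sum(a != b for a, b in zip(best, best[1:]))
    if len(distinct_urls) > 1 and flips >= 3:
        print(f"Possible cannibalization on '{keyword}': "
              f"{len(distinct_urls)} URLs swapping, {flips} flip(s)")
```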

Less-obvious cases

Sometimes it's less obvious though. So a good example that I found recently is if, or at least in my case, if I Google search Naples, as in the place name, I see Wikipedia ranking first and second. The Wikipedia page ranking first was about Naples, Italy, and the Wikipedia page ranking second was about Naples, Florida.

Now I do not think that Wikipedia is cannibalizing itself in that situation. I think that they just happen to have… Google had decided that this SERP is ambiguous and that this keyword “Naples” requires multiple intents to be served, and Wikipedia happens to be the best page for two of those intents.

So I wouldn’t go to Wikipedia and say, “Oh, you need to combine these two pages into a Naples, Florida and Italy page” or something like that. That’s clearly not necessary. 

Questions to ask 

So if you want to figure out in that kind of more ambiguous case whether there’s cannibalization going on, then there are some questions we might ask ourselves.

1. Do we think we’re underperforming? 

So one of the best questions we might ask, which is a difficult one in SEO, is: Do we think we’re underperforming? So I know every SEO in the world feels like their site deserves to rank higher, well, maybe most. But do we have other examples of very similar keywords where we only have one page, where we’re doing significantly better? Or was it the case that when we introduced the second page, we suddenly collapsed? Because if we see behavior like that, then that might,  you know, it’s not clear-cut, but it might give us some suspicions. 

2. Do competing pages both appear? 

Similarly, if we look at examples of similar keywords that are less ambiguous in intent, so perhaps in the burgers case, if the SERP for “best burgers” and the SERP for “buy burgers,” if those two keywords had completely different results in general, then we might think, oh, okay, we should have two separate pages here, and we just need to make sure that they’re clearly differentiated.

But if actually it’s the same pages appearing on all of those keywords, we might want to consider having one page as well because that seems to be what Google is preferring. It’s not really separating out these intents. So that’s the kind of thing we can look for is, like I say, not clear-cut but a bit of a hint. 

3. Consolidate or differentiate? 

Once we’ve figured out whether we want to have two pages or one, or whether we think the best solution in this case is to have two pages or one, we’re going to want to either consolidate or differentiate.

So if we think there should only be one page, we might want to take our two pages, combine the best of the content, pick the strongest URL in terms of backlinks and history and so on, and redirect the other URL to this combined page that has the best content, that serves the slight variance of what we now know is one intent and so on and so forth.

If we want two pages, then obviously we don’t want them to cannibalize. So we need to make sure that they’re clearly differentiated. Now what often happens here is a commercial page, like this Buy Burgers page, ironically for SEO reasons, there might be a block of text at the bottom with a bunch of editorial or SEO text about burgers, and that can make it quite confusing what intent this page is serving.

Similarly, on this page, we might at some stage have decided that we want to feature some products on there or something. It might have started looking quite commercial. So we need to make sure that if we’re going to have both of these, that they are very clearly speaking to separate intents and not containing the same information and the same keywords for the most part and that kind of thing.

Quick tip

Lastly, it would be better if we didn't get into the situation in the first place. So a quick tip that I would recommend, just as a last takeaway, is to check before you produce a piece of content. For example, before I produced this Whiteboard Friday, I did a site:moz.com cannibalization search so I could see what content had previously existed on Moz.com that was about cannibalization.

I can see, oh, this piece is very old, so we might — it’s a very old Whiteboard Friday, so we might consider redirecting it. This piece mentions cannibalization, so it’s not really about that. It’s maybe about something else. So as long as it’s not targeting that keyword we should be fine and so on and so forth. Just think about what other pieces exist, because if there is something that’s basically targeting the same keyword, then obviously you might want to consider consolidating or redirecting or maybe just updating the old piece.

That’s all for today. Thank you very much.

Video transcription by Speechpad.com. 

Retail media networks and advertisers going from guesswork to growth

Retail media networks (RMNs) are a fast-growing space in advertising — up 16.3% last year. Retailers are adding channels and providing better measurement and execution for advertisers. And agencies are answering their clients’ calls to include RMNs in the mix. 

While brands are increasing their spend, there are still growing pains around the lack of standardization. Without the ability to compare apples to apples from one RMN to another, brands and agencies are left with more number-crunching and guessing. However, the opportunity to reach high-intent customers within a retailer’s network is too powerful to pass up.

So what’s the current state of the RMN-advertiser relationship? That was the focus of an IAB roundtable this week featuring agency and RMN representatives.

IAB introduced media measurement guidelines in September 2023 so RMNs and adtech partners could begin to help brands and agencies connect the dots. If it’s easier for advertisers to execute and measure campaigns across multiple RMNs, they’ll increase their overall spend, the theory goes.

“We were very cognizant that adoption [of measurement guidelines] would take 12 to 24 months,” said Jeffrey Bustos, VP, measurement addressability data at IAB. “I think from all the conversations with retailers, the momentum and their investment in moving toward standardization is still very strong.”

While standardization remains a goal rather than a reality, the increasing opportunities within individual RMNs as they expand offsite and in-store offerings make them something advertisers must consider.

CVS Media Exchange’s focus on shopping experience

“When you look at retail and consumer spending, a lot of that still happens in a store format,” said Praveen Menon, head of analytics and business intelligence for CVS Media Exchange (CMX). “Someone like CVS, who has 9,000 stores, [attracts shoppers] coming into these stores with a lot of attention and urgency, with a clear consumer need they’re looking to fulfill.”

CVS Media Exchange can measure the connection between when loyalty members search in the CVS app or online and when they purchase in-store through the loyalty program.

“Over 50% of our loyalty members, when they browse something online, they come to our store within 48 hours,” said Menon. “For us, the in-store experience is a key unlock.”

CVS plans to build better shopping and ad experiences through in-store screens and other touchpoints. The retailer has 1,700 screens in its stores and plans to add more this year.


Data collaboration for offsite measurement needs

At some level, all RMNs function as a form of data collaboration: the retailer uses customer data in a privacy-safe way to help outside parties serve timely, relevant messages. This collaborative environment extends to offsite channels beyond an RMN's owned digital and in-store properties.

For instance, CVS partnered with visual discovery platform Pinterest and adtech company LiveRamp on a clean room initiative that helps CVS measure the effectiveness of offsite ads on Pinterest that lead to CVS purchases by loyalty members.

“What data clean rooms allows us to do is to provide advertisers and agencies with precise, attributed and meaningful performance insights for campaigns they run with CMX, especially on offsite channels where we have limited ad exposure data,” said Menon. “This really allows us to report back about performance with a high degree of accuracy.”

He added: “On some other platforms, where they don’t share some of those ad exposures or don’t collaborate via clean rooms, measurement is more of a black-box solution. So for us, as we think about our innovative offerings, we want every channel that brands invest in through CMX to be measurable.”

Advertisers and agencies push for standards

This collaborative environment might one day lead to multiple RMNs adopting the same measurement standards. That’s at least the intention with the IAB guidelines. Agencies are seeing great RMN interest from clients. Agencies and brands have influence, through the budget they spend, to support RMNs that are standardized.

“We have to put our money where our solutions are,” said Evan Hovorka, VP of product innovation, Albertsons Media Collective. “That might require some brands forcing the next dollar to go to an RMN that supports some of these adopted standards.”

“What’s really interesting is that our [clients] are so excited to adapt and test and learn, because they’ve seen the success of retail media,” said Kavita Cariapa, SVP, head of commerce activation at dentsu. “And luckily there’s actually a good framework being built out with partners like CVS and Albertson’s. There’s a groundwork built on tech partnerships, collaboration.”

“Outside of its adoption rates, there are a couple of [IAB member RMNs] that have done fantastic jobs,” said Riyaad Edoo, executive director of commerce at EssenceMediacom. “Accountability is really what holds progress back. If my investment (on behalf of a brand client) isn’t going to directly reflect what my asks are, then those asks are empty. But that might not be the right conversation. In most instances, we can survive on some amount of measurement inconsistency and not be happy about it. We do have advanced data analytics teams working tirelessly to sort of weather these anomalies.”

“One of our biggest wins that we present to our brands is, baseline, how to inform a channel investment,” said Cariapa. “There are always questions about how, when and where to invest. How do they invest in a retailer as they expand their opportunities?”

Dentsu reports how clients are growing market share with campaigns and how specific channels perform, so brands aren’t just tracking impressions but results, Cariapa said.

Less than halfway through IAB’s two-year estimated timeframe for guideline adoption, it looks like better RMN measurement is likely, as advertisers and agencies continue to invest in this growing ecosystem.


The post Retail media networks and advertisers going from guesswork to growth appeared first on MarTech.

Virtual Avatars fueled by Generative AI increasing B2B sales and marketing scalability https://luckyjoyspins.org/virtual-avatars-fueled-by-generative-ai-increasing-b2b-sales-and-marketing-scalability/ https://luckyjoyspins.org/virtual-avatars-fueled-by-generative-ai-increasing-b2b-sales-and-marketing-scalability/#respond Fri, 09 Aug 2024 12:37:04 +0000 https://luckyjoyspins.org/?p=72388

Vidyard, a popular video messaging company, has recently announced its entry into the virtual digital avatar market and secured new funding of $15M from Export Development Canada (EDC), BMO Capital Partners, and existing investors Battery Ventures, Bessemer Venture Partners, and iNovia Capital. This brings the company’s total funding to $90.7M.

According to Gartner Research, AI avatars built on generative AI technology will support 70% of digital and marketing communications by 2025, up from less than 5% in 2022. The AI avatar market was valued at USD 14.34 billion in 2022 and is expected to grow at a compound annual growth rate (CAGR) of 47.1% from 2023 to 2030. Growth is being driven by increased demand for remote collaboration tools, customer preference for video content in corporate communications, hybrid work environments, and advances in video streaming technology.
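To put that growth rate in perspective, compounding the 2022 figure at 47.1% a year implies a market north of $300 billion by 2030. A quick sketch of the arithmetic, assuming eight compounding years from 2023 through 2030 (the report’s exact convention isn’t stated):

```python
# Compound annual growth: value_n = value_0 * (1 + cagr) ** years
base_2022 = 14.34   # USD billions, per the figure quoted above
cagr = 0.471
years = 8           # assumed: 2023 through 2030 inclusive

projected_2030 = base_2022 * (1 + cagr) ** years
print(f"Implied 2030 market size: ~${projected_2030:.0f}B")  # roughly $314B
```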

Jonathan Lister, Chief Operating Officer at Vidyard, says “the increasing demand for authentic and personalized characters for online interactions by companies from various industries is opening up a future where AI-powered video communications can become the cornerstone of productive connections between sellers and buyers.”

Vidyard’s advantage over leading AI video and avatar providers, such as Synthesia, HeyGen, and Tavus, is that its products are embedded in a sales workflow, making it easy to create personalized videos and send them to buyers without leaving the sales ecosystem. Market leaders like Microsoft, Marketo, and HubSpot are already using Vidyard.

HubSpot’s VP of Platform Ecosystem, Scott Brinker, who has been testing Vidyard’s AI Avatar platform, says that “virtual avatars will address a lot of critical pain points for HubSpot users, including the stage fright associated with traditional video recording, the massive amount of time it takes to get them ‘right,’ the difficulty of finding the right place and time to do it, and the fact that it’s rarely their top priority.”

In a recent customer survey by GTM Partners, Vidyard found that adding video increased sales funnel performance by 85%: customers booked four to five times more meetings, generated two to four times more sales-qualified opportunities, and saw a 25% increase in close rates.
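Taken at face value, those lifts compound through the funnel: even at the low end, twice the sales-qualified opportunities closing at a 25% higher rate works out to roughly 2.5 times the closed deals. A minimal sketch with invented baseline numbers:

```python
# Hypothetical baseline, illustrative numbers only, not GTM Partners data.
baseline_sqos = 30          # sales-qualified opportunities per quarter (assumed)
baseline_close_rate = 0.20  # assumed

# Low end of the lifts quoted in the survey above.
sqo_lift = 2.0              # two to four times more sales-qualified opportunities
close_rate_lift = 1.25      # a 25% increase in close rates

baseline_deals = baseline_sqos * baseline_close_rate
with_video_deals = baseline_sqos * sqo_lift * baseline_close_rate * close_rate_lift

print(f"Closed deals without video: {baseline_deals:.0f}")    # 6
print(f"Closed deals with video:    {with_video_deals:.0f}")  # 15
```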

According to Gartner, AI’s capability to help sellers engage more effectively with prospects and customers, and to do so at scale by automating labor-intensive tasks, is one of its most valuable contributions to sales. As customer communication preferences continue to shift, virtual avatars make it far easier for sales and marketing teams to deliver highly personalized messages and improve customer and buyer interactions.

Kelvin Beachum, an NFL player for the Arizona Cardinals, has found Vidyard’s virtual avatar technology useful. He says, “AI technology is just beginning to crack the surface of its capabilities. For me, time is money. As an organization, we are always trying to be efficient with our time to accomplish as much as possible at a high level. Using AI for efficiency allows for a high-velocity workflow of my daily operations, and I am excited to continue implementing it.”

Lister says “B2B buying and selling is fundamentally broken. Buyers want timely and relevant information and automation that helps simplify their complex buying journey. While the sellers struggle to build meaningful relationships, communicate with an ever-increasing number of stakeholders, and deliver high-quality insights given the limitations of current sales technology.”

As this market continues to unfold, deepfakes and the broader risks of AI-generated video communication will require a vigilant eye. Stanford has released guidelines on digital communication ethics to help mitigate deepfake risks.

Off the back of US strike, UK union sets minimum pay for video game performers https://luckyjoyspins.org/off-the-back-of-us-strike-uk-union-sets-minimum-pay-for-video-game-performers/ https://luckyjoyspins.org/off-the-back-of-us-strike-uk-union-sets-minimum-pay-for-video-game-performers/#respond Fri, 09 Aug 2024 12:34:11 +0000 https://luckyjoyspins.org/?p=72385

No one was surprised when British actor Neil Newbon won the “Best Performance” category at The Game Awards last year for his role as Astarion in Baldur’s Gate 3, beating Hollywood royalty Idris Elba’s appearance in Cyberpunk 2077: Phantom Liberty. (However, it is safe to assume that Elba, while also British, may have been offered significantly more for his role.)

Disparities in pay for video game performers have become fiercely contentious. The UK’s performing arts and entertainment trade union, Equity, has proposed a solution. The union announced today that it has set recommended minimum rates for video game performers, hoping to close the pay gap with their North American colleagues.

Video game performers have traditionally not made a lot of money from their contributions, even for highly popular franchises. The rise of AI and synthetic media has not done much to improve the situation. 

Game On! for AI protection and fair pay

After negotiations over AI protections for workers broke down a couple of weeks ago between unions and game publishers such as Activision Blizzard, Take-Two and WB Games, the US Screen Actors Guild (SAG-AFTRA) called a strike.

“Eighteen months of negotiations have shown us that our employers are not interested in fair, reasonable AI protections, but rather flagrant exploitation,” said interactive media agreement negotiating committee chair Sarah Elmaleh.

Spurred on by the controversy across the pond, Equity has taken the opportunity to set a recommended minimum for the payment of video game performers in the UK for the very first time. 

“It comes as Equity stands in solidarity with our sister union, SAG-AFTRA, who are on strike against major video game companies in the US,” Equity said in a statement. “Many of the issues experienced by SAG-AFTRA members are shared by Equity members, such as the threat of artificial intelligence.” 

However, unlike in the US, Equity stated, there is no collectively bargained agreement with video games companies to set union-agreed minimum rates and pay performers fairly.

The recommendations are part of Equity’s Game On! campaign and are intended to “address systemic low pay for performers working in video games in the UK,” which the union says has stagnated and not kept pace with inflation.

Furthermore, UK video game performers are paid far less than their colleagues in the US or Canada, despite being hired by the same studios. 

The rates are intended for AAA games. That is, games that are high-budget, high-profile and produced and distributed by major game publishers.

They include minimum fees for, among other things, voiceover work (per hour), motion and performance capture (per day), promotional activity (per hour), and overtime (per 30 minutes past wrap time). The full rate card is published by Equity.

Safeguarding the art form of video games

In the words of YouTuber Redbeardflynn, while gaming may not be something everyone associates with art, it is “absolutely an art form. It is in a lot of ways the combination of multiple forms of art coming together to create a piece of art that you can actually interact with.”

It is essential for that art form that we continue to have performances like Neil Newbon’s Astarion, Ashley Johnson’s Ellie from The Last of Us, Christopher Judge’s Kratos in God of War, and Jennifer Hale in, well, just about anything. 

But it would also be made so much poorer if minor roles were to be replaced by AI-generated voices, and performers were not able to make a living off their craft.

“Our goal is to ensure fair pay and good working conditions for the performers who have trained for years to develop the skills they use to bring video games to life,” Equity said. “We urge Equity members to demand these minimum standards and we invite studios and developers to work with us on collective agreements that protect everyone and will ensure the games industry in the UK continues to thrive.”

Space balloon for tourists set for landmark test flight https://luckyjoyspins.org/space-balloon-for-tourists-set-for-landmark-test-flight/ https://luckyjoyspins.org/space-balloon-for-tourists-set-for-landmark-test-flight/#respond Fri, 09 Aug 2024 12:32:08 +0000 https://luckyjoyspins.org/?p=72382

A “space balloon” for tourists is set for a test flight in Saudi Arabia this September.

Spanish startup Halo Space built the balloon for zero-emission trips into the stratosphere. Each ticket is expected to cost around €150,000.

On each flight, Halo plans to welcome aboard eight passengers and one pilot. The balloon will then ascend to altitudes of up to 35km. From this lofty perch in the sky, the tourists will enjoy views of the cosmos.

But before that vision becomes reality, Halo needs to prove that the balloon is safe. The Saudi test aims to produce fresh evidence.


If all goes to plan, the full-size prototype capsule, Aurora, will rise to 30km above the Earth. Throughout the flight, Halo will assess the technology’s performance.

“This mission is designed to meticulously validate all the critical systems we’ve been developing for the past three years,” said Alberto Castrillo, the startup’s CTO.

The space balloon has already entered a hangar in Riyadh. Credit: Halo Space
Saudi Arabia’s space ambition

Saudi Arabia has been a key partner for Halo. The country plans to become a global leader in near-space exploration and has given substantial support to the sector.

Halo has welcomed the assistance. The company has established its flagship operational base and final assembly site in the kingdom.

The startup has also worked closely with the kingdom’s regulator for the sector, the Communications, Space, and Technology Commission (CST).

Both CST and Halo have pledged to prioritise safety during the test flight.

If the mission is successful, space tourists could soon purchase tickets. Halo plans to launch crewed flights next year. Commercial trips are set to follow in 2026.

ASML’s latest machine powers new breakthroughs for logic and memory chips https://luckyjoyspins.org/asmls-latest-machine-powers-new-breakthroughs-for-logic-and-memory-chips/ https://luckyjoyspins.org/asmls-latest-machine-powers-new-breakthroughs-for-logic-and-memory-chips/#respond Fri, 09 Aug 2024 12:30:15 +0000 https://luckyjoyspins.org/?p=72379

Imec, a leading semiconductor research company based in Belgium, today announced a series of chipmaking breakthroughs at its joint lab with ASML.

The lab opened its doors in June with the aim of giving ecosystem partners early access to the High NA EUV prototype scanner.

The High NA machine represents the latest advancement in extreme ultraviolet (EUV) lithography systems, which use light to draw chip patterns on silicon wafers. It’s ASML’s most high-end tool to date.

Now, imec says that the use of the technology has already yielded impressive results.

The first is the successful printing of circuit patterns for logic chips with 9.5nm dense metal lines, which correspond to a 19nm pitch and sub-20nm tip-to-tip dimensions. In practical terms, this means extremely small, densely packed circuits that can deliver more powerful and efficient chips.
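For readers less familiar with lithography jargon, “dense” lines are printed at a 1:1 line-to-space ratio, so the pitch is simply line width plus space width. A quick sanity check of the figures above, under that 1:1 assumption:

```python
# Dense (1:1 line/space) patterning: pitch = line width + space width.
line_width_nm = 9.5
space_width_nm = 9.5   # assumed equal to the line width for "dense" lines

pitch_nm = line_width_nm + space_width_nm
half_pitch_nm = pitch_nm / 2

print(f"Pitch: {pitch_nm} nm")            # 19.0 nm, matching the figure above
print(f"Half-pitch: {half_pitch_nm} nm")  # 9.5 nm
```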

imec also demonstrated High NA lithography’s capability to enable 2D routing in logic chips. 2D routing refers to the process of creating intricate pathways on a single layer of a chip to connect various components or circuit elements.

In addition, the Belgian research company said it has managed to pattern designs for Dynamic Random-Access Memory (DRAM) chips in a single exposure.

Exposure is the step in which light is used to transfer a pattern onto a semiconductor wafer. DRAM chips typically require multiple exposures, so reducing that number to one can significantly cut manufacturing costs.

The patterns for the logic chips were also printed in a single exposure.
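A toy cost model makes it easy to see why single-exposure patterning matters: every additional exposure adds another scanner pass plus the extra litho-etch processing around it, so per-wafer patterning cost climbs roughly in step with the exposure count. The numbers below are invented assumptions for illustration, not ASML or imec figures.

```python
# Toy model: relative patterning cost versus the number of EUV exposures.
# Both constants are illustrative assumptions.
COST_PER_EXPOSURE = 1.0        # normalized cost of one scanner pass
OVERHEAD_PER_EXTRA_PASS = 0.4  # extra etch/deposition work per added exposure

def relative_cost(exposures: int) -> float:
    """Return patterning cost relative to a single scanner pass."""
    return exposures * COST_PER_EXPOSURE + (exposures - 1) * OVERHEAD_PER_EXTRA_PASS

single = relative_cost(1)  # 1.0
double = relative_cost(2)  # 2.4
print(f"Double patterning costs ~{double / single:.1f}x a single exposure")
```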

According to imec, these early demonstrations confirm ASML’s claims that the High NA machines can manufacture smaller, faster, and more energy-efficient chips.

“High NA EUV will therefore be highly instrumental to continue the dimensional scaling of logic and memory technologies,” said Luc Van den hove, president and CEO of imec.

Van den hove also pointed out that these demonstrations will “accelerate” the introduction of these tools into manufacturing.

Intel was reportedly the first to purchase ASML’s High NA machines in May this year. The list of buyers also includes Samsung and chip giant TSMC, although the latter was initially sceptical about the technology.
