
How SMBs are approaching digital marketing in 2024

A new report released this month by Intero Digital (disclosure: I previously worked with Intero), “Revolutionizing Reach: The 2024 Digital Marketing Landscape You Can’t Ignore,” checks the yearly pulse for the digital spending and marketing activities of its SMB client list. 

The report covers a wide digital landscape and includes the results of their client survey, as well as insights and strategies for addressing the pain points and goals marketers face this year. Some of the key coverage areas include:

  • Digital marketing spending trends.
  • Top digital marketing goals for 2024.
  • Content marketing trends.
  • Emerging trends and data in Amazon marketing.
  • Emerging trends in paid advertising.
  • Transformation in the SEO space.
  • Influencer marketing.
  • Video marketing tactics and strategy.

Let’s look at a few highlights and key insights from the report.

Key digital marketing priorities for SMBs in 2024

The first finding should not surprise those actively engaged in digital marketing across various industries, verticals, niches and three-letter acronyms like “B2C” and “B2B.” The most important thing on digital marketers’ minds in 2024 is revenue growth, prioritized as the top goal by 79% of respondents.

While “growth” is something of a given for most marketers, it is still worth stating, if only to get it out of the way and focus on the digital marketing strategies and activities that actually drive it. But for the digital marketers in this survey, what really matters is the final result – the bottom line.

Marketers aren’t backing down this year in their quest for growth, with 46.2% planning to increase spending, zero percent decreasing and 53.8% planning to stay the same.

Digital marketing spend in 2024 - survey question

The biggest digital marketing pain points for 2024

The path to revenue growth and lead generation is filled with obstacles, and the survey respondents also outlined many of their common and shared challenges.

Biggest pain points in 2024 - survey question

Aside from revenue, the top pain points were lead generation, budget constraints, competition and changing algorithms and SEO best practices. While these points are cited individually, they are oftentimes interconnected. 

Competition is fierce everywhere marketers are chasing the bottom line across many different digital channels, and it doesn’t appear things will settle down any time soon. 

When competition is fierce, more investment is often needed to be competitive, be it paid or earned media. This leads to the pain point of effectively measuring digital marketing ROI.

Budget constraints often stem from ineffective measurement and tracking systems. Ultimately, what’s crucial is a comprehensive and well-executed digital marketing strategy and tactical plan, whether focusing on a single channel or across multiple channels.

Some 7.7% of respondents cited branding as a pain point. Building a brand across crowded and competitive digital spaces is challenging, as it takes considerable work, budget and time. Combined with the other pain points, marketers have a lot of work ahead of them this year.

The top digital marketing goals for 2024

Biggest goals in 2024 - survey question

In the “Biggest Goals” section of the survey, “Revenue growth” topped the list at 79.5%, then dropped a bit for each of the supporting goals. In second place were “More web traffic” and “Higher positioning in the SERPs” (each at 38.5%). 

Interestingly, while lead generation was the number two biggest pain point (33.3%), it came in fifth place for the biggest goal (30.8%). Are the lead generators giving up? Nope. With the close percentages, it appears they’re consistent with their priorities.

It is also interesting to see brand awareness and thought leadership prominently in this goal list (33.3%), especially when it is often difficult to attribute to revenue and it was not as much of a pain point. 

The biggest barrier to entry for thought leadership is actually having the best ideas and insight. For those who do, this can be one of the easiest goals to accomplish in 2024.

Again, when done well, it can complement and drive goals of more traffic, higher positioning, more growth, more qualified leads and so on.

Content marketing insights

Another interesting part of this study was a review of over 100,000 pieces of content. In their analysis, Intero found that the “most-shared” content had an average word count of 552, down considerably from the previous year’s figure of 792 words.

The implication is that messaging was more important than being wordy, at least in the social context. For marketers competing in the SEO space with higher word counts, a balance should be struck in the content strategy to emphasize both channels through content.

Marketing publishers are also looking to diversify their content in 2024 beyond the written word. Some top asset types utilized by digital marketers this year include video (41% of marketers utilizing), podcasts (41%) and the top diversified activity, newsletters (47%).

Top asset types for SMBs in 2024

About the report

This article touches on only a few points from the report, providing strategic insights on digital marketing tactics and approaches for the year.

Will next year’s survey bring significant changes, or will things remain unchanged? It’s uncertain, but one thing is clear: the digital landscape is evolving rapidly, offering opportunities for early adopters to thrive in various channels despite competition.

All images in this report were provided by Intero Digital. You can download the full report here.

Original source:

Google: Apple rejected Microsoft Bing deal over quality and advertising worries

Microsoft reportedly tried to sell Bing to Apple in 2018 – but failed.

The sale was unsuccessful because Apple had concerns about Bing’s search quality, advertising unit and monetization capabilities, according to a document in the U.S. Justice Department’s antitrust case against Google.

Google argues that the failed sale highlights that Bing lost out because it is an inferior search engine, not because Google has made the search market unfair for its competitors, per CNN.

Apple turns down Microsoft. Microsoft allegedly proposed either selling Bing to Apple or creating a Bing-joint venture together. Eddy Cue, Apple’s senior vice president of services, explained why Apple said no to Microsoft in the filing:

  • “Microsoft search quality, their investment in search, everything was not significant at all. And so everything was lower. So the search quality itself wasn’t as good.”
  • “They weren’t investing at any level comparable to Google or to what Microsoft could invest in. And their advertising organization and how they monetize was not very good either.”

What Google is saying. Google claims that Microsoft’s failed attempt to sell Bing to Apple demonstrates that Bing is an inferior search engine. Google wrote in the filing:

  • “In each instance, Apple took a hard look at the relative quality of Bing versus Google and concluded that Google was the superior default choice for its Safari users. That is competition.”

Google also argues that Microsoft approached Apple multiple times – in 2009, 2013, 2015, 2016, 2018, and 2020 – proposing to make Bing the default search engine in Apple’s Safari web browser. However, Apple consistently rejected the idea, citing concerns about the quality of Bing’s performance.

What Microsoft is saying. Microsoft denies its attempts to sell Bing to Apple were due to quality issues – on the contrary, the tech giant claims the sale was blocked by Google. Microsoft’s CEO of Advertising and Web Services, Mikhail Parakhin, said he offered Apple more than 100% of the revenue or gross profit to make Bing its default search engine – but the proposal was rejected because of Apple’s deal with Google.

Speaking at the federal antitrust trial, Parakhin alleged this was despite Microsoft offering to pay Apple more than Google – which he claims was offering in the region of 60%. He said:

  • “We were just big enough to play but probably not big enough to win, if that makes sense.”

What the DOJ is saying. Google has a financial agreement in place with Apple to be the default search engine on its products at a cost of around $10 billion a year. In addition, Google pays Apple advertising revenue – which is one of the search engine’s biggest costs. The Justice Department asserted in its filing that Apple boss Cue testified:

  • “If Apple did not receive the massive payments it sought from Google, Apple would have developed its own search engine.”

Google could pay Apple as much as $19 billion this fiscal year, according to an estimate from Bernstein analyst Toni Sacconaghi. The DOJ claims these substantial sums of money are why Apple turned down a potential joint venture with Microsoft – not because Bing is inferior to Google.

Why we care. The U.S. Justice Department is suing Google, claiming it uses unfair tactics to maintain its position as the top search engine globally. If Google can demonstrate its search engine’s superiority, it might succeed in the case. However, if it loses, Google could lose its automatic default search engine status on common Apple products, posing a potential threat to its dominance in the search market.

Market share statistics. The U.S. Justice Department claims Google owns a 90% market share in search. On the other hand, Bing has 3% of the global market share, according to StatCounter.


Deep dive. Read our Google antitrust trial updates for everything you need to know.

Original source:

Eco-Friendly Plumbing Solutions for Orange County Residents

Home Business Magazine Online

Orange County residents are increasingly seeking eco-friendly plumbing solutions to contribute to environmental conservation while also reducing their household water and energy bills. Modern plumbing techniques and technologies offer a range of options for homeowners who want to make their homes more sustainable. These solutions include low-flow fixtures, such as toilets and showerheads, tankless water heaters, and greywater recycling systems. Each of these options helps in minimizing water usage and preserving the precious water resources in Southern California.

Implementing eco-friendly plumbing practices is not just beneficial for the environment; it also aligns with Orange County’s progressive outlook on ecological responsibility. Local regulations and incentives often encourage the adoption of green plumbing, making it an attractive choice for residents. Moreover, the long-term cost savings and improved efficiency can be a significant advantage for homeowners. By incorporating these solutions, Orange County residents can enjoy the dual benefits of helping the planet and optimizing their home’s plumbing systems.

To ensure effective and safe installation, it is imperative for homeowners to work with licensed plumbers who specialize in eco-friendly plumbing solutions. Proper installation and maintenance are crucial to realizing the full benefits of these systems. With a growing emphasis on sustainability, Orange County’s plumbing industry continues to evolve, providing residents with innovative and environmentally sound options to meet their plumbing needs.

Innovative Eco-Friendly Plumbing Technologies

As Orange County residents look to conserve water and reduce utility bills, innovative eco-friendly plumbing technologies offer practical solutions. These advancements aim to minimize water wastage and promote sustainability.

Water-Saving Fixtures and Appliances

Companies like Socal Plumbing & Rooter advocate for the installation of water-saving fixtures and appliances to significantly cut down on water consumption. Low-flow toilets, which use less than 1.6 gallons per flush, can save thousands of gallons of water annually. Similarly, water-efficient showerheads and faucet aerators are designed to reduce flow without compromising performance.

  • Low-flow toilets
    • Usage: < 1.6 gallons per flush
    • Savings: Up to 20,000 gallons per year for a family of four
  • Showerheads
    • Flow rate: Often below 2.0 gallons per minute
    • Feature: Aerating or non-aerating spray patterns
  • Faucet aerators
    • Reduction: Decrease flow by adding air

Faucet aerators not only conserve water but also reduce the energy costs associated with heating water.

Tankless Water Heaters

Tankless water heaters, also known as on-demand water heaters, provide hot water only as it is needed, eliminating the need to constantly heat a storage tank. This results in energy savings as well as reduced water usage since residents don’t have to run the tap while waiting for hot water to arrive from a distant heater.

  • Energy Efficiency: Up to 34% more energy-efficient than traditional water heaters for homes that use 41 gallons or less of hot water daily.

Greywater Recycling Systems

Greywater recycling systems offer an innovative way to reuse water from sinks, showers, and laundry, redirecting it for non-potable uses such as toilet flushing and irrigation. This eco-friendly plumbing solution can dramatically decrease freshwater consumption.

  • Reuse: Greywater can account for 50-80% of residential wastewater.
  • Applications: Suitable for landscape irrigation and toilet flushing.

Installations must comply with local codes and guidelines to ensure health and safety. Socal Plumbing & Rooter provides expertise in setting up greywater systems that conform to these standards.

By adopting these technologies, residents can play a pivotal role in water conservation, fostering a more sustainable future for Orange County.

The post Eco-Friendly Plumbing Solutions for Orange County Residents appeared first on Home Business Magazine.

Original source:

Deicing Dilemmas: Choosing the Right Deicer

Home Business Magazine Online

Winter’s frosty grip transforms the landscapes into breathtaking wonderlands, but beneath the shimmering snow and ice lurks a hidden danger: slippery surfaces. Every year, icy sidewalks, driveways, and roads contribute to countless slips, falls, and accidents, disrupting lives and businesses alike. While the urge to combat the ice is strong, simply ordering bulk calcium chloride might not be the wisest decision. Choosing the right deicer for your specific needs is crucial, and this guide aims to equip you with the knowledge to navigate the winter battlefield victoriously.

The Winter Wonderland Nightmare

While snow may paint a picturesque scene, hidden beneath its sparkling blanket lies a silent threat: ice. Every winter, these seemingly innocent frozen surfaces transform sidewalks, driveways, and roads into treacherous obstacle courses. The consequences of a fall on ice can be severe, ranging from minor bumps and bruises to serious injuries like broken bones and head trauma. Statistics reveal a grim reality: falls on ice account for countless injuries and even fatalities each year, impacting individuals, families, and communities alike.

While resorting to any available road dust control products for sale might be tempting for certain situations like unpaved parking lots, understanding the specific threats posed by ice and the limitations of alternative solutions is crucial for making informed decisions about winter safety. Fortunately, a wide range of deicers exists, each with its strengths and weaknesses, to help us combat this seasonal adversary.

Deicer Diversity: Unveiling the Options

Stepping into the realm of deicers can feel overwhelming, much like facing a blizzard head-on. But fear not, for beneath the seemingly complex facade lies a collection of readily available warriors ready to fight the icy foe. Here are the main protagonists in the deicing drama of the right deicer:

  • Rock Salt (Sodium Chloride): The classic hero, known for its affordability and wide availability. It works effectively at moderate temperatures (above 20°F), melting ice through an endothermic process (absorbing heat from the surroundings). However, its effectiveness wanes at colder temperatures and it can be harsh on concrete and vegetation.
  • Calcium Chloride: The exothermic champion, generating its own heat to melt ice even at temperatures as low as -25°F. This powerhouse comes at a higher cost than rock salt, and its potent nature requires caution regarding potential damage to plants and concrete. If you need to conquer extreme cold, calcium chloride is your knight in shining armor, but consider its “bulk calcium chloride” option for larger areas to keep costs in check.
  • Magnesium Chloride: The environmentally conscious contender, boasts a gentler touch on vegetation and concrete compared to its chloride counterparts. It works effectively down to 5°F but loses its melting power at colder temperatures. While not as fast-acting as calcium chloride, it might be the eco-friendly hero for milder climates.
  • Acetate-Based Deicers: The gentle giants, known for their minimal impact on concrete, plants, and even pets. They work best at slightly warmer temperatures (above 15°F) and rely on an endothermic process for melting. While more expensive than rock salt, their slower melting rate and environmental friendliness make them ideal for sensitive areas.

Choosing the right deicer depends on a unique battleground. Stay tuned as we explore the factors that will guide you to your deicing champion!

Choosing Your Champion: Factors to Consider

Now that you’ve met the deicing contenders, it’s time to strategize. Before selecting your champion, consider these key factors:

  • Temperature: The battlefield’s temperature dictates which warriors will thrive. Remember, rock salt reigns at moderate temperatures, calcium chloride conquers the coldest depths, magnesium chloride battles mid-range chills, and acetate-based deicers excel in slightly warmer zones. Consult road salt companies or deicer manufacturers for specific temperature ranges to ensure you choose the right weapon for the job.
  • Surface Type: Not all heroes fight on the same terrain. While rock salt and calcium chloride work well on concrete and asphalt, their harshness might damage delicate surfaces like wood or pavers. Magnesium chloride offers a gentler touch, and acetate-based deicers are often the safest choice for sensitive surfaces. Choose your deicer based on the specific battlefield you need to conquer.
  • Environmental Concerns: Consider the impact your deicing battle will have on the surrounding environment. While rock salt’s affordability is tempting, its runoff can harm vegetation and water quality. Calcium chloride demands caution due to its similar impact. Magnesium chloride offers a more eco-friendly approach, with acetate-based deicers being the gentlest on the environment. Remember, a responsible victory considers the well-being of all participants, not just humans.
  • Budget: Every hero has a price tag. Rock salt remains the most cost-effective option, followed by magnesium chloride. Calcium chloride comes at a premium, especially when considering its higher application rates. Acetate-based deicers are often the most expensive but might be worth the investment for sensitive areas or environmentally conscious battles.

Beyond the Basics: Advanced Deicing Strategies

While choosing the right deicer is crucial, seasoned warriors know that mastering the art of winter deicing goes beyond simply throwing down some granules. Here are some advanced strategies to enhance your deicing prowess:

  • Pre-wetting: This technique involves mixing your deicer with a brine solution before application. The brine lowers the freezing point of the deicer, allowing it to work effectively at even colder temperatures. Additionally, the pre-wetting solution helps the deicer adhere better to surfaces, increasing its efficiency and reducing waste.
  • Blending Deicers: Combining different deicers can leverage their individual strengths. For example, mixing rock salt with calcium chloride can extend the effective temperature range while reducing the harshness of calcium chloride alone. Consult deicer manufacturers or “road salt companies” for recommended blends based on your specific needs.
  • Heated Mats: For high-traffic areas like walkways or entrances, heated mats offer a permanent solution. These mats are embedded under the surface and generate heat to melt ice and snow automatically. While more expensive upfront, they can be cost-effective in the long run for heavily used areas.
  • Robotic Snow Removal: Cutting-edge technology is changing the game with robotic snow removal solutions. These autonomous machines can clear snow and apply deicers, enhancing efficiency and safety, especially in large areas like parking lots or commercial properties.
  • Dowflake Xtra: As an example of an innovative deicer, consider Dowflake Xtra. This calcium chloride blend boasts improved melting power and lower application rates compared to traditional calcium chloride, offering a potentially cost-effective solution for colder climates.

Remember, effective deicing is not just about choosing the right weapon, but also mastering the art of strategy and utilizing diverse tactics. By incorporating these advanced techniques, you can transform yourself from a winter warrior into a deicing champion, ensuring safe and clean surfaces throughout the coldest months.


Conquering winter’s icy grip requires knowledge and strategy. This guide has equipped you with the essential tools to navigate the world of deicers, understand their strengths and weaknesses, and choose the champion best suited for your battlefield. Remember, consider the temperature, surface type, environmental impact, and budget to make informed decisions. Explore advanced tactics like pre-wetting, blending, and innovative solutions to truly master the art of winter deicing. With the right approach, you can ensure safe and accessible walkways, driveways, and roads, transforming winter wonderlands into safe and welcoming spaces for everyone.

The post Deicing Dilemmas: Choosing the Right Deicer appeared first on Home Business Magazine.

Original source:

How to speed up site migrations with AI-powered redirect mapping

Migrating a large website is always daunting. Big traffic is at stake among many moving parts, technical challenges and stakeholder management.

Historically, one of the most onerous tasks in a migration plan has been redirect mapping. The painstaking process of matching URLs on your current site to the equivalent version on the new website.

Fortunately, this task that previously could involve teams of people combing through thousands of URLs can be drastically sped up with modern AI models.

Should you use AI for redirect mapping?

The term “AI” has become somewhat conflated with “ChatGPT” over the last year, so to be very clear from the outset, we are not talking about using generative AI/LLM-based systems to do your redirect mapping.

While there are some tasks that tools like ChatGPT can assist you with, such as writing that tricky regex for the redirect logic, the generative element that can cause hallucinations could potentially create accuracy issues for us.
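
That kind of regex-driven redirect logic can be prototyped and sanity-checked in plain Python before being translated into your server’s rewrite rules. The URL pattern below is purely hypothetical:

```python
import re

# Hypothetical pattern: dated blog URLs such as /blog/2023/05/post-name/
# collapse to /insights/post-name/ on the new site.
pattern = re.compile(r"^/blog/\d{4}/\d{2}/([^/]+)/?$")

def redirect_target(path):
    """Return the new path for a matching old path, else None."""
    m = pattern.match(path)
    return f"/insights/{m.group(1)}/" if m else None

print(redirect_target("/blog/2023/05/site-migrations/"))  # → /insights/site-migrations/
```

Testing the pattern against a handful of known URLs like this catches off-by-one mistakes before they become live 404s.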

Advantages of using AI for redirect mapping


The primary advantage of using AI for redirect mapping is the sheer speed at which it can be done. An initial map of 10,000 URLs could be produced within a few minutes and human-reviewed within a few hours. Doing this process manually would usually take a single person days of work.

Using AI to help map redirects is a method you can use on a site with 100 URLs or over 1,000,000. Large sites also tend to be more programmatic or templated, making similarity matching more accurate with these tools.

For larger sites, a multi-person job can easily be handled by a single person with the correct knowledge, freeing up colleagues to assist with other parts of the migration.

While the automated method will get some redirects “wrong,” in my experience, the overall accuracy of redirects has been higher, as the output can specify the similarity of the match, giving manual reviewers a guide on where their attention is most needed.

Disadvantages of using AI for redirect mapping


Using automation tools can make people complacent and over-reliant on the output. With such an important task, a human review is always required.

The script is pre-written and the process is straightforward. However, it will be new to many people, and environments such as Google Colab can be intimidating.

Output variance 

While the output is deterministic, the models will perform better on certain sites than others. Sometimes, the output can contain “silly” errors, which are obvious for a human to spot but harder for a machine.

A step-by-step guide for URL mapping with AI

By the end of this process, we are aiming to produce a spreadsheet that lists “from” and “to” URLs by mapping the origin URLs on our live website to the destination URLs on our staging (new) website.

For this example, to keep things simple, we will just be mapping our HTML pages, not additional assets such as CSS or images, although this is also possible.

Tools we’ll be using

  • Screaming Frog Website Crawler: A powerful and flexible website crawler, Screaming Frog is how we collect the URLs and associated metadata we need for the matching.
  • Google Colab: A free cloud service that uses a Jupyter notebook environment, allowing you to run a range of languages directly from your browser without having to install anything locally. Google Colab is how we are going to run our Python scripts to perform the URL matching.
  • Automated Redirect Matchmaker for Site Migrations: The Python script by Daniel Emery that we’ll be running in Colab.

Step 1: Crawl your live website with Screaming Frog

You’ll need to perform a standard crawl on your website. Depending on how your website is built, this may or may not require a JavaScript crawl. The goal is to produce a list of as many accessible pages on your site as possible.

Crawl your live website with Screaming Frog

Step 2: Export HTML pages with 200 Status Code

Once the crawl has been completed, we want to export all of the found HTML URLs with a 200 Status Code.

Firstly, in the top left-hand corner, we need to select “HTML” from the drop-down menu.

Screaming Frog - Highlighted- HTML filter

Next, click the sliders filter icon in the top right and create a filter for Status Codes containing 200.

Highlighted: Custom filter options

Finally, click on Export to save this data as a CSV.

Highlighted: Export button

This will provide you with a list of our current live URLs and all of the default metadata Screaming Frog collects about them, such as Titles and Header Tags. Save this file as origin.csv.

Important note: Your full migration plan needs to account for things such as existing 301 redirects and URLs that may get traffic on your site but are not accessible from an initial crawl. This guide is intended only to demonstrate part of this URL mapping process; it is not an exhaustive guide.

Step 3: Repeat steps 1 and 2 for your staging website

We now need to gather the same data from our staging website, so we have something to compare to.

Depending on how your staging site is secured, you may need to use features such as Screaming Frog’s forms authentication if password protected.

Once the crawl has completed, you should export the data and save this file as destination.csv.

Optional: Find and replace your staging site domain or subdomain to match your live site

It’s likely your staging website is either on a different subdomain, TLD or even domain that won’t match our actual destination URL. For this reason, I will use a Find and Replace function on my destination.csv to change the path to match the final live site subdomain, domain or TLD.

For example:

  • My live website (origin.csv) is on its final live domain.
  • My staging website (destination.csv) is on a staging subdomain.
  • The site is staying on the same domain; it’s just a redesign with different URLs, so I would open destination.csv and find any instance of the staging subdomain and replace it with the live domain.

Find and Replace in Excel

This also means when the redirect map is produced, the output is correct and only the final redirect logic needs to be written.
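
The find and replace can also be scripted. This sketch uses pandas with hypothetical staging and live hostnames (staging.example.com and www.example.com); in practice you would load the real export with pd.read_csv("destination.csv") and save the result back out:

```python
import pandas as pd

# Hypothetical sample of a staging crawl; the real data comes from
# pd.read_csv("destination.csv").
df = pd.DataFrame({
    "Address": [
        "https://staging.example.com/about/",
        "https://staging.example.com/blog/post-1/",
    ]
})

# Swap the staging host for the live host in every URL, so the final
# redirect map needs no further rewriting.
df["Address"] = df["Address"].str.replace(
    "https://staging.example.com", "https://www.example.com", regex=False
)

print(df["Address"].tolist())
```

Scripting the replacement avoids the risk of a spreadsheet autocorrecting or reformatting URL cells while you edit.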

Step 4: Run the Google Colab Python script

When you navigate to the script in your browser, you will see it is broken up into several code blocks, and hovering over each one will give you a “play” icon. This is if you wish to execute one block of code at a time.

However, the script will work perfectly just executing all of the code blocks, which you can do by going to the Runtime menu and selecting Run all.

Google Colab Runtime

There are no prerequisites to run the script; it will create a cloud environment and on the first execution in your instance, it will take around one minute to install the required modules.

Each code block will have a small green tick next to it once it is complete, but the third code block will require your input to continue and it’s easy to miss as you’ll likely need to scroll down to see the prompt.


Step 5: Upload origin.csv and destination.csv

Highlighted: File upload prompt

When prompted, click Choose files and navigate to where you saved your origin.csv file. Once you have selected this file, it will upload and you will be prompted to do the same for your destination.csv.

Step 6: Select fields to use for similarity matching

What makes this script particularly powerful is the ability to use multiple sets of metadata for your comparison.

This means if you’re migrating to a new architecture where your URL Address is not comparable, you can run the similarity algorithm on other factors under your control, such as Page Titles or Headings.

Have a look at both sites and try to judge which elements remain fairly consistent between them. Generally, I would advise starting simple and adding more fields if you are not getting the results you want.

In my example, we have kept a similar URL naming convention, although not identical and our page titles remain consistent as we are copying the content over.

Select the elements you want to use and click Let’s Go!

Similarity matching fields

Step 7: Watch the magic

The script’s main components are all-MiniLM-L6-v2 and FAISS, but what are they and what are they doing?

all-MiniLM-L6-v2 is a small and efficient model within the Microsoft series of MiniLM models, which are designed for natural language processing (NLP) tasks. MiniLM converts the text data we’ve given it into numerical vectors that capture its meaning.

These vectors then enable the similarity search, performed by Facebook AI Similarity Search (FAISS), a library developed by Facebook AI Research for efficient similarity search and clustering of dense vectors. This will quickly find our most similar content pairs across the dataset.
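
Conceptually, the matching step boils down to a nearest-neighbour search over those vectors. Here is a numpy-only sketch with toy two-dimensional vectors standing in for the MiniLM embeddings (FAISS performs the same cosine-style search, just far faster and at scale):

```python
import numpy as np

# Toy stand-in embeddings; in the real script these come from
# all-MiniLM-L6-v2, and FAISS handles the search.
origin_vecs = np.array([[1.0, 0.1], [0.1, 1.0]])  # two origin pages
dest_vecs = np.array([[0.2, 0.9], [0.9, 0.2]])    # two staging pages

def normalise(v):
    # L2-normalise rows so the inner product equals cosine similarity.
    return v / np.linalg.norm(v, axis=1, keepdims=True)

sims = normalise(origin_vecs) @ normalise(dest_vecs).T

best = sims.argmax(axis=1)   # nearest destination row for each origin row
score = sims.max(axis=1)     # becomes the similarity_score column
print(best.tolist())  # → [1, 0]
```

In the Colab script, this lookup is what ultimately produces the matched_url and similarity_score columns in output.csv.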

Step 8: Download output.csv and sort by similarity_score

The output.csv should automatically download from your browser. If you open it, you should have three columns: origin_url, matched_url and similarity_score.

Output csv example

In your favorite spreadsheet software, I would recommend sorting by similarity_score.

Excel Sort by similarity score

The similarity score gives you an idea of how good the match is. A similarity score of 1 suggests an exact match.

By checking my output file, I immediately saw that approximately 95% of my URLs have a similarity score of more than 0.98, so there is a good chance I’ve saved myself a lot of time.
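
If you prefer to triage the file programmatically rather than in a spreadsheet, a short pandas pass can surface the weakest matches first. The rows and the 0.95 threshold below are hypothetical:

```python
import pandas as pd

# Hypothetical rows; the real data is the downloaded output.csv,
# loaded with pd.read_csv("output.csv").
out = pd.DataFrame({
    "origin_url": ["/about/", "/team/jane/", "/old-blog-post/"],
    "matched_url": ["/about-us/", "/team/", "/blog/"],
    "similarity_score": [0.99, 0.71, 0.42],
})

# Sort ascending so the matches most in need of review come first,
# then flag anything under an arbitrary 0.95 threshold.
review = out.sort_values("similarity_score")
needs_review = review[review["similarity_score"] < 0.95]

print(needs_review["origin_url"].tolist())  # → ['/old-blog-post/', '/team/jane/']
```

Pick whatever threshold matches your risk tolerance; the point is simply to direct human attention to the low-scoring tail.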

Step 9: Human-validate your results

Pay special attention to the lowest similarity scores on your sheet; this is likely where no good matches can be found.

Output.csv: Lower-scored similarities

In my example, there were some poor matches on the team page, which led me to discover not all of the team profiles had yet been created on the staging site – a really helpful find.

The script has also helpfully given us redirect recommendations for old blog content we decided to axe from the new website. Should we want to pass that traffic to something related, we now have a suggested redirect – that’s ultimately your call.

Step 9: Tweak and repeat

If you didn’t get the desired results, I would double-check that the fields you use for matching are staying as consistent as possible between sites. If not, try a different field or group of fields and rerun.

More AI to come

In general, I have been slow to adopt any AI (especially generative AI) into the redirect mapping process, as the cost of mistakes can be high, and AI errors can sometimes be tricky to spot.

However, from my testing, I’ve found these specific AI models to be robust for this particular task, and the approach has fundamentally changed how I handle site migrations.

Human checking and oversight are still required, but the time saved on the bulk of the work means you can review the results more thoroughly and thoughtfully, and finish the task many hours ahead of where you would usually be.

In the not-too-distant future, I expect we’ll see more specific models that will allow us to take additional steps, including improving the speed and efficiency of the next step, the redirect logic.

Original source:

LinkedIn publishes actionable steps before third-party cookie deprecation

LinkedIn released a new guide detailing actionable steps and tactics you can take now to maximize campaign performance in the absence of third-party cookies.

Amid regulatory changes and reduced identifiers, the playbook offers advice on how to meet the demand for personalized ad experiences in a cookieless world.

Why we care. As the deprecation of third-party cookies looms, LinkedIn’s insights could be valuable in guiding approaches, but it’s important to note that the advice is published with LinkedIn’s interests in mind rather than solely benefiting brands and advertisers.

1. Create a robust first-party data strategy. LinkedIn recommends setting up a strong framework for collecting and using first-party data. To do this, the platform suggests utilizing its Audience Targeting tools, like Audience Insights and Matched Audiences. Start with a broad approach and conduct A/B testing on the same ad creative with diverse audiences to ensure your message is resonating. 

The platform also advises prioritizing building audience trust through clear and transparent messaging about the data value exchange, alongside straightforward opt-in and opt-out options as you invest in first-party data.

2. Embrace AI. LinkedIn recommends trying new technologies for scalable targeting and ROI, highlighting its Predictive Audience tool. This feature utilizes LinkedIn’s AI and your data to create custom high-intent audience segments. While emphasizing the importance of human oversight, LinkedIn notes that AI and ML capabilities include:

  • Enhanced targeting and personalization.
  • Predictive analytics.
  • Real-time campaign optimization.
  • Ad personalization based on behavior and preferences for different segments.

3. Reevaluate your reporting strategy. To combat reporting challenges from third-party cookie deprecation, LinkedIn suggests reevaluating approaches like last-click attribution, which may overstate the impact of activities like search and display. Instead, the platform recommends collaborating with your Sales and Finance departments to identify more meaningful metrics.

LinkedIn also advises leveraging CRM data through tools like the Revenue Attribution Report to connect marketing efforts with sales success. Additionally, using first-party Conversion Tracking provides insights into post-click and view-through conversions, facilitating measurement of ad impact and ROI.

4. Test, learn and win. LinkedIn warns that you should not assume what worked in the past will work in a cookieless future. Optimize campaigns by testing, learning, and winning to reduce costs and stay ahead. Evaluate campaign performance comprehensively, including its impact on brand health metrics. Measure your brand campaign’s effectiveness on LinkedIn through brand lift testing, setting baselines, and optimizing through A/B tests for improved performance, ROI, and valuable insights.

5. Work with a LinkedIn marketing partner. LinkedIn has partnered with solution and service providers with specialized capabilities in areas like predictive audiences and revenue attribution. The playbook offers their practical solutions and case studies to help you implement these features on LinkedIn. Investing in resilient data and solutions now ensures optimal performance across the marketing funnel amidst privacy changes.

What LinkedIn is saying. Brandon Johnson, Senior Marketing Manager at LinkedIn, said in a statement:

  • “Gartner estimates that by the end of 2024, 75% of the global population will have their personal data covered by privacy regulations.”
  • “Taking steps now to invest in data and solutions that are more resilient to privacy changes will help ensure you are maximizing performance across the marketing funnel.”

Get the daily newsletter search marketers rely on.

Deep dive. Read LinkedIn’s blog post in full or download its new guide for more information.

Original source:

Insight Valuations LLC: Full-Scale Business Appraisals at Lower Fees

Home Business Magazine Online

Insight Valuations LLC provides fair market value appraisals of businesses, including whole and partial interests at the subject company’s enterprise or equity level. By having the right-sized resource pool, we can provide appraisals at a lower cost than the larger regional and national valuation firms while still maintaining excellent levels of service and quality.

Most importantly, our appraisals are performed by qualified and credentialed business valuation specialists who produce reports that meet the reporting standards typically required by outside parties such as the U.S. Treasury (IRS) and are generally sought out by legal, financial planning, and other professional advisors.

Whether you want to get an idea of how much your whole company is worth for corporate planning purposes, or are in the midst of estate and gift tax planning and need to know the value of a single share or a 1% equity interest in your company on a non-controlling and non-marketable basis, We Can Value That™.

Comprehensive Valuation Services Tailored to Your Needs with Competitive Pricing

While Insight Valuations is based in Houston, Texas, the U.S. energy capital, our experience includes serving clients across the United States and in a broad range of industries, including family-operated furniture manufacturers; local to global-sized oil and gas equipment manufacturers and field service companies; pre-clinical bio/pharmaceutical companies; Midwestern architectural and engineering design firms; and utility service providers in New England.

For simpler engagements, we offer restricted reports that may only include a subset of the range of valuation approaches and methods typically used in a full-scale report. Where a larger valuation firm may offer such a service at a minimum fee of $10,000, this would likely fall in the range of $3,000 to $6,000 with Insight Valuations. For more complex valuation engagements, such as valuing one common share of a privately held company on a non-controlling and non-marketable basis by utilizing methods within the Income, Market, and Cost approaches, the applicable fee range with Insight Valuations may be $7,000 to $12,000, depending on the engagement purpose and intended users of the valuation report. The fee range for the same engagement at a larger firm would likely be $15,000 to $25,000. For a low flat fee, Insight Valuations also offers quarterly letter summary valuation updates to supplement the initial engagement report so you can track the value of the subject interest over the subsequent year.

Affordable and Accessible Valuation Services with Insight Valuations LLC

So, why is there such a difference in our fee level? Larger valuation firms are equipped to manage almost any valuation need beyond basic business valuations by having data subscriptions to research platforms that provide global and international market data feeds. While this level of resources allows these firms to address 99% of all possible valuation needs, it creates an overhead cost structure that causes these firms to have de facto minimum fee requirements.

Quite often, these minimum fees set by the service provider surpass the maximum fees that small business owners can pay.

The cost to value your company should not materially impact your bottom line. Insight Valuations maintains a right-sized data resource pool with a focus on U.S.-based market data that allows us to drastically reduce our cost structure to serve the majority of the potential clients who would need our services. Our costs do not define our clients.

And to the 99% of those potential clients who have had to hold off from getting a valuation analysis performed because of the cost…

We Can Value That™

Please visit us and reach us via email at or by phone at (346) 554-3706.


Original source:

Google investigating Local Services Ads bug

Google is investigating a Local Services Ads bug that’s causing businesses to appear in search results for their competitors’ names.

The issue comes just weeks after the platform started piloting branded Local Service Ads (LSA) called Direct Business Search. This new ad type allows brands to show up in search results when someone directly searches for a brand or business.

Google has clarified that through Direct Business Search, only your business should appear in search results when your brand is queried, and charges will be incurred solely for leads from new customers.

Why we care. When users search for a specific brand on Google, their intent is to find that brand, not a general service or product. If your competitors appear in the search results for queries related to your business’s name, Google not only fails to fulfill user intent but also risks directing your customers to competitors.

First spotted. The issue was first flagged by Joy Hawkins, SEO consultant and founder of Sterling Sky Inc, on X:

  • “I wonder how many of these businesses know they’re bidding on a competitor’s name. Really dislike this rollout with Local Services Ads. I don’t see how it’s better for anyone.”

Hawkins shared a screenshot of the Google SERP for the query “a1 garage door service” in January 2024:

Screenshot: Google SERP for “a1 garage door service,” January 2024

Highlighting the impact of the LSA bug, she then shared a screenshot for the same search query in February 2024:

Screenshot: Google SERP for “a1 garage door service,” February 2024

Opting out. Although brands are automatically enrolled in Direct Business Search ads by default, you can disable it by manually adjusting your campaign settings.

What Google is saying. Google Ads Liaison Officer Ginny Marvin responded to Hawkins’ post on X, writing:

  • “This is a bug (and it was thought to be a non-brand query) – a fix is underway.”


Deep dive. Read our article on Google’s Branded Local Service Ads pilot for more information.

Original source:

Lookback: Google Panda algorithm update launched 13 years ago

It was 13 years ago today that Google launched the Panda update. This algorithm update had a massive and far-reaching impact on SEO and content strategies for years to come.

Here’s a complete history of the Google Panda algorithm update.

What was the Google Panda update?

Google Panda was a major algorithm update that impacted search rankings for 11.8% of queries in the U.S.

Google said Panda was designed to reduce the rankings for low-quality sites (“sites which are low-value add for users, copy content from other websites or sites that are just not very useful”) and reward better rankings to high-quality sites (“sites with original content and information such as research, in-depth reports, thoughtful analysis and so on”).

The algorithm officially started rolling out on Feb. 23, 2011, but it wasn’t until Feb. 24, 2011, that we learned about it.

Panda didn’t only impact content farms. Websites of all sizes, across industries, felt the wrath of Panda.

There were multiple reports of companies going out of business because they had relied on Google traffic for years. Once that organic traffic was gone, so was their business.

After its initial launch, Google announced several refreshes and updates to Panda on a near-monthly basis for the next two years (9 in 2011 and 14 in 2012). You will find a complete timeline and our coverage of those updates at the end of this guide.

One thing Google Panda was not: a penalty. Websites impacted by the Google Panda Update were downgraded algorithmically. This means no Google employee took manual action to cause the lower rankings.

For anyone hit by Panda, it probably felt like a penalty. But Google doesn’t consider downgrading rankings for a website algorithmically as a “penalty.”

Why was Google Panda launched?

The main target of Google Panda was low-quality content. In 2011, Google was trying to find a solution for its content farm problem.

In case you’re not familiar with the term, a “content farm” was a website that paid freelance writers (typically poorly) to pump out as much content as fast as possible. The sole goal for these companies was to rank extremely well in Google’s search results for high-traffic keywords. Article quality was typically low.

Those high rankings meant these content farms got lots of traffic. And they monetized that traffic via display advertising (ironically, many of those sites were using Google AdSense). Demand Media is probably the best example of the classic content farm model. Suite 101 was another big content farm.

Interestingly, many of the content issues Google was trying to solve in 2010 had come about after the Caffeine Update. Google was now crawling content faster, and its index grew rapidly. But it also meant some “shallow” content was ranking prominently.

Business Insider even published an article with this rather blunt headline: “Google’s Search Algorithm Has Been Ruined, Time To Move Back To Curation.” On TechCrunch, there was: “Why We Desperately Need a New (and Better) Google.”

Well, on Feb. 24, 2011, Google rocked the world of content farms – and the entire SEO industry. Google now had a way to look for signals that differentiate high-quality and low-quality sites.


Why did Google name it Panda?

In an interview with Wired, Amit Singhal, head of Google search, revealed where the Panda name came from:

“Well, we named it internally after an engineer, and his name is Panda. So internally we called it big Panda. He was one of the key guys. He basically came up with the breakthrough a few months back that made it possible.”

– Google’s Amit Singhal in 2011

The full name of that Google engineer is Navneet Panda.

However, the update wasn’t referred to as “Panda” outside of Google when it initially launched. Search Engine Land founder Danny Sullivan called it the “Farmer” update – until the Panda name was revealed in Wired. So if you see the Farmer name pop up in any of our old coverage of Panda, that’s why.

Google Panda algorithm explained: how it worked

Much speculation followed the arrival of Google Panda. Most of that is irrelevant today, so we won’t rehash the many Panda myths in this guide.

Here’s a rundown of what we learned about how Google Panda worked and what made it unique.

Panda was an algorithm applied to sites

The most important thing you need to understand about Panda is that it evaluates the overall quality of the entire website.

Here’s what Gary Illyes, Google webmaster trends analyst, told Search Engine Land in a 2016 interview:

“… we don’t think of Panda as a penalty now, but rather as an algorithm applied to sites … or sites as a whole.

It measures the quality of a site pretty much by looking at the vast majority of the pages at least. But essentially allows us to take quality of the whole site into account when ranking pages from that particular site and adjust the ranking accordingly for the pages.”

Google’s Gary Illyes, in 2016

So what Illyes confirmed is that Panda demotes content, essentially meaning low-quality content can harm ranking.

Google’s John Mueller also said that Panda looks at site architecture to assess website quality.

And Panda also assigned a sitewide score, according to Mueller.

Did removing or improving low-quality content help?

On March 8, 2011, Michael Wyszomierski, a member of the Google webspam team, posted a message in a Webmaster Central help forum thread:

“Our recent update is designed to reduce rankings for low-quality sites, so the key thing for webmasters to do is make sure their sites are the highest quality possible. We looked at a variety of signals to detect low quality sites. Bear in mind that people searching on Google typically don’t want to see shallow or poorly written content, content that’s copied from other websites, or information that are just not that useful. In addition, it’s important for webmasters to know that low quality content on part of a site can impact a site’s ranking as a whole. For this reason, if you believe you’ve been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.”

– Google’s Michael Wyszomierski, in 2011

What’s important to note here about Panda:

  • It can identify content that is shallow, poorly written, or copied from other websites.
  • Poorly written content can hurt rankings.
  • Removing low-quality pages from part of a website can help high-quality pages rank better.
  • Improving content (making it “useful”) can help your rankings.

Later on, Google tried to walk back the idea of removing content. Google started recommending adding high-quality content and fixing the low-quality content instead. This is most likely because many in SEO started taking a metaphorical blowtorch to their content instead of a scalpel.

At SMX East 2017, Illyes said pruning content didn’t help with Panda.

“It’s very likely that you did not get Pandalyzed because of your low-quality content. It’s more about ensuring the content that is actually ranking doesn’t rank higher than it should. …  It definitely does not work with the current version of the core algorithm, and it may just bring your traffic farther down. Panda basically disregards things you do to rank artificially. You should spend resources on improving content instead, but if you don’t have the means to do that, maybe remove it instead.”

– Google’s Gary Illyes, in 2017

Google on how to evaluate whether a site was ‘high-quality’

On May 6, 2011, Google published a list of 23 questions to explain how Google searches for high-quality sites and to help those that Panda had negatively impacted:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?

How to recover from Panda

The big question for anyone whose site was hit by Panda: How do I recover? Aside from pointing to the blog post with those 23 questions, Google provided a little more information over time.

On July 9, 2012, during a Google Search Central hangout from India, Google’s Matt Cutts said it was possible to recover 100%. How?

“Take a fresh look and basically ask yourself, ‘How compelling is my site?’ We’re looking for high quality. We’re looking for something where you land on it, you’re really happy, the sort of thing where you wanna tell your friends about it and come back to it, bookmark it. It’s just incredibly useful.”

Google’s Matt Cutts in 2012

Here’s the video:

Cutts also provided some high-level insight in another Google Search Central video released Sept. 11, 2013:

“… the overriding kind of goal is to try to make sure that you’ve got high-quality content, the sort of content that people really enjoy, that’s compelling, the sort of thing that they’ll love to read that you might see in a magazine or in a book, and that people would refer back to, or send friends to, those sorts of things,” Cutts said. “So that would be the overriding goal. …

So if you are not ranking as highly as you were in the past, overall, it’s always a good idea to think about, OK, can I look at the quality of the content on my site? Is there stuff that’s derivative, or scraped, or duplicate, and just not as useful?”

Google’s Matt Cutts in 2013

Here’s the video:

Google still uses Panda (sort of)

On Jan. 12, 2016, we reported that Panda had become part of Google’s core algorithm. Google never confirmed an exact date when Panda was consumed into Google’s core ranking system.

All we know with 100% certainty is that it happened at some point in 2015 (according to Google’s guide to Google Search ranking systems) after Google’s last confirmed Panda update (4.2, on July 17, 2015).

In June 2016, Mueller said that Google’s search engineers still tweaked aspects of Panda, but it was fundamentally the same algorithm.

Although Panda was continuously running, Panda wasn’t real-time and the rollouts took months, Illyes confirmed at SMX Advanced 2016.

Panda was replaced by Coati

Technically, Panda no longer exists. That’s because Panda evolved into a new algorithm called Coati.

Basically, Google Coati was the successor to the Panda algorithm. We learned this from Hyung-Jin Kim, the VP of Google Search, at SMX Next in November 2022.

So even though it’s been years since Panda was replaced, Panda lived on through Coati, as part of Google’s core algorithm.

So that’s why I say Google sort of still uses Panda – because Google’s core algorithm today continues to do many of the same things that Panda was originally created to do in 2011.

The legacy of Google Panda

Google Panda is still impacting SEO to this day, more than a decade after it launched. Here are just three big changes that came from Panda:

  • E-E-A-T. The concept of Expertise, Authoritativeness, and Trustworthiness (plus the additional E for Experience, as found in the quality rater guidelines) can be traced back to the content farm problem. Google always wants to reward websites that publish high-quality content by experts who are authorities on a subject. The quality rater guidelines are the ultimate guide for how not to be a content farm.
  • Content marketing. Content marketing, as a term, wasn’t really a thing before Panda. Just go to Google Trends and you’ll see when the term started gaining popularity, and not by coincidence. The term itself may date back to 1996 (and the concept itself is as old as marketing), but content marketing as its own thing was essentially born as a result of Panda.
  • Differentiate or die. Perhaps the biggest lesson coming out of Google Panda was to never put all your eggs in one basket. Especially not Google’s. Relying on any one channel for all, or nearly all, of your traffic and income is not a business plan. It is a gamble. Never leave yourself at the mercy of any one channel or platform – because you have no control over a third party. If one algorithm change can wipe out your entire company overnight, you’re doing it wrong.

A complete timeline of the Google Panda Updates

Below is the complete list of all 28 confirmed Panda updates and refreshes. Oh, and since we didn’t get into the difference between an update versus a refresh, here’s what that means: an update was a change to the Google Panda algorithm, whereas a refresh meant Google reapplied all the same signals from the most recent update.

Here’s Search Engine Land’s coverage of Panda, from 2011 to 2016:

Feb. 23, 2011: Panda Update 1.0

April 11, 2011: Google Panda Update 2.0

May 9, 2011: Panda Update 2.1

June 21, 2011: Panda Update 2.2

July 23, 2011: Panda Update 2.3

Aug. 12, 2011: Panda Update 2.4

Sept. 28, 2011: Panda Update 2.5

Oct. 19, 2011: Panda Update 3.0

Nov. 18, 2011: Panda Update 3.1

Jan. 18, 2012: Panda Update 3.2

Feb. 27, 2012: Panda Update 3.3

March 23, 2012: Panda Update 3.4

April 19, 2012: Panda Update 3.5

April 27, 2012: Panda Update 3.6

June 8, 2012: Panda Update 3.7

June 25, 2012: Panda Update 3.8

July 24, 2012: Panda Update 3.9

Aug. 20, 2012: Panda Update 3.9.1

Sept. 18, 2012: Panda Update 3.9.2

Sept. 27, 2012: Panda Update #20

Nov. 5, 2012: Panda Update #21

Nov. 21, 2012: Panda Update #22

Dec. 21, 2012: Panda Update #23

Jan. 22, 2013: Panda Update #24

March 14, 2013: Panda Update #25

July 18, 2013: Panda Update #26

May 20, 2014: Panda Update 4.0

Sept. 23, 2014: Panda Update 4.1

July 17, 2015: Panda Update 4.2

Original source:
