The Brand Is What Matters

I liked this post from Search Engine Journal, which linked to Jono Alderson's piece, Stop testing, Start shipping. It makes four points that I cannot emphasize enough:

  1. SEO is interconnected, not isolated
  2. There are no true control groups
  3. The test takes too long, and the world changes while you wait
  4. You can’t observe most of what matters
https://www.jonoalderson.com/conjecture/stop-testing-start-shipping/

Now I am a numbers and data person. I love using data, building metrics, and so on to figure out the right answer. But when it comes to SEO, I have a hard time believing a lot of the “studies” that come out. I put that in quotes because many of these studies aren’t worth the pixels used to paint the text.

There are so many variables in SEO, so many moving targets, that it's almost impossible to allocate credit to what worked and what didn't. This is exactly why I advocate working on the brand: instead of burning time trying to squeeze another half a percent out of SEO, work on burnishing your brand.

Which brings me to this Search Engine Land article commenting on local search, with this gem:

For instance, while past stats suggested 76% of people check out a brand’s online presence before visiting in person, that story is shifting… Which brands stick.

Meanwhile, the basics still matter. Great customer service, strong community ties, memorable in-person experiences—these remain the foundation of building a local audience. What’s new is that these moments now shape what AI sees, what content surfaces in feeds, and what earns trust at scale.

https://searchengineland.com/guide/local-marketing

Yes, absolutely. As we move into an increasingly AI-driven world, the easiest thing to quantify is not the quality of blog writing, or whether the writing is actually true. The easiest thing for AIs to do is quantify which brands are the most trusted, and then lean on those brands as the base from which to generate their text.

Which takes me to this Reddit post covering an Ahrefs study: Websites With More Organic Search Traffic Get Mentioned More In AI Search. That headline deserves an “obvious” tag plastered all over it, because it makes intuitive sense: websites with better, more respected branding are considered more authoritative, and therefore they should get mentioned more.

So bottom line: as we move towards the AI-ification of marketing, the winner takes it all. Take good care of your brand. Keep it squeaky clean and you will see dividends.

Accept Zip File, Extract All and Save To Google Cloud Storage Function

From the “my blog is actually my code backup store” department: this is a simple Cloud Function I use on Google Cloud to accept a base64-encoded zip file and unzip its contents into a Google Cloud Storage bucket.

import functions_framework
import base64
from google.cloud import storage
import zipfile
import os
import datetime
import pytz

@functions_framework.http
def hello_http(request):
    # Pull the base64-encoded body out of the request and decode it back to raw zip bytes.
    request_bytes = base64.b64decode(request.get_data())
    print("request method: " + request.method)
    print("request content length: " + str(request.content_length))
    print("decoded payload size: " + str(len(request_bytes)))
    # Generate a working directory prefix from the current Chicago date-time,
    # so each upload lands under its own timestamped path.
    datetime_string = datetime.datetime.now(pytz.timezone("US/Central")).isoformat()
    print("Current Chicago Date-Time: %s" % (datetime_string))
    directory_prefix = datetime_string[:19].replace(":", "-")
    # Dump the original zip to GCS. Note: no leading slash on the blob name,
    # otherwise GCS creates an awkward unnamed top-level "folder".
    gcs_client = storage.Client()
    gcs_bucket = gcs_client.get_bucket("bucket name goes here")
    file_blob = storage.Blob("zipped/" + directory_prefix + ".zip", gcs_bucket)
    file_blob.upload_from_string(request_bytes, content_type="application/zip", client=gcs_client)
    # Write the zip to the function's temporary directory so zipfile can read it.
    zip_path = "/tmp/" + directory_prefix + ".zip"
    with open(zip_path, "wb") as f:
        f.write(request_bytes)
    # Extract everything; extractall creates the target directories as needed.
    unzip_dir = "/tmp/local/unzip/" + directory_prefix + "/"
    with zipfile.ZipFile(zip_path) as zip_file:
        zip_file.extractall(unzip_dir)
    # Upload each extracted file to GCS under open/<prefix>/, skipping any
    # subdirectories the archive might contain.
    for file_name in os.listdir(unzip_dir):
        local_path = unzip_dir + file_name
        if not os.path.isfile(local_path):
            continue
        print(file_name)
        file_blob = storage.Blob("open/" + directory_prefix + "/" + file_name, gcs_bucket)
        file_blob.upload_from_filename(local_path, client=gcs_client)
    print("end")
    return str(len(request_bytes))
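
To feed it, a caller just base64-encodes a zip file and POSTs the result as the raw request body. Here is a minimal sketch of such a caller; the function URL and file name are placeholders, not anything from the deployed function:

import base64
import requests

# Hypothetical trigger URL; substitute your deployed function's own.
FUNCTION_URL = "https://REGION-PROJECT.cloudfunctions.net/hello_http"

# Base64-encode the zip file and send it as the request body.
with open("backup.zip", "rb") as f:
    payload = base64.b64encode(f.read())

response = requests.post(FUNCTION_URL, data=payload)
print(response.text)  # the function echoes back the decoded payload size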

You may want to alter the timezone (the US/Central reference) and fill in your own bucket name, but otherwise it's a small and efficient tool for moving data somewhere I can easily find it later by date.
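
If you are rebuilding this yourself, the function also needs its three third-party dependencies declared. A minimal requirements.txt, left unpinned here, might look like:

functions-framework
google-cloud-storage
pytz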