Homestead's Search Engine Optimization

Article · Updated 4 months ago
A couple of weeks ago we introduced you to our Full Service package: a designed website coupled with Search Engine Optimization. It's a great service, but what if you already created a spectacular site on your own? No problem. Read on...

The Importance of Search Engine Optimization

SEO (Search Engine Optimization) is a vital piece of making sure people can find your website in search engines. These days, more and more people are using the internet to search for local services and products, making it important that your site can be found by your potential customers.

SEO in a nutshell is following search engines' best practices and doing keyword research to help people discover your website when they search online. The higher your website appears in the results, the more likely you are to receive traffic and visitors. Below are some tips you can use to make sure your website has the best possible chance of getting found online.

1. Choose the right keywords. First, knowing how people search for your business will help determine what to focus on within your website as well as externally. There are some great tools, such as Google Keyword Planner or Ahrefs, that will give you insights into the most popular terms and an estimate of the traffic to your site if you got to the first page of Google. There are other factors, such as the competition level of each keyword, but this is at least a good start toward knowing your audience so you can plan accordingly.

2. Make your site relevant for keywords. Once you have determined which phrases you want to target, the next step is adding them to your website. Search engines like Google look at everything to determine what to rank you for. Some places to insert your keywords are:


  • Content – Writing about your services and products is the easiest way to help search engines pick up keywords for your website. The more you can write, the more relevant Google will think you are. Make sure your content is easy to read, but focus it around the keywords you want to rank for. This may mean you will need to create a few more pages in order to cover all your products and services. Also, DON'T just stuff keywords into your pages. Search engines want to see that your content is readable for a good visitor experience. They know when you are trying to beat the system and can penalize you for it.
  • Meta Data – There are places where you can enter keywords into the backend of your website. Specifically, you want to focus on adding good keywords to the Page Title of each page of your site. Make sure the keywords are relevant to that page. For example, if you are trying to rank for "Plumber Phoenix," have that phrase be optimized on a plumbing page or services page. If those keywords show up in the page title and content of that specific page, you will have a better chance of ranking for it.
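As an illustration of the page-title advice above, here is a sketch of how the "Plumber Phoenix" example might appear in a page's HTML head (the business name and description text are made up for the example):

```html
<head>
  <!-- Page title: the target phrase appears naturally, near the front.
       "Smith Plumbing" is a made-up business name for illustration. -->
  <title>Plumber in Phoenix, AZ | Smith Plumbing</title>
  <!-- Meta description: shown under the title in search results; write it for people -->
  <meta name="description"
        content="Licensed Phoenix plumber offering repairs, installations, and 24/7 emergency service.">
</head>
```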
3. Start building trust. Once your website is relevant for the keywords you want to target, the last step is building backlinks to your site. A backlink is a link on another website that points back to yours. The more of these you can build, the more trust a search engine will have in your site and the higher you will rank for the keywords you want to target. Places you can build backlinks are:

a. Directories – Try to find directories within your industry.

b. Article sites – Writing articles about your industry will not only help establish you as an expert in your field, but also build trust to increase your rankings.

c. Social networks – Building an audience on Facebook is a great way to increase your brand awareness and build backlinks to your website every time you post.


There are thousands of sites you can submit your website to in order to get backlinks. This can be a fairly time-consuming task, but it is needed in order to outrank your competitors. Make sure you look for quality sites to submit your website to, and keep at it. You will want to continue to build trust every month until you start seeing results.

Let Us Do This For You


If you would rather have professionals increase your rankings online, our team of experts is here for you. Work with a dedicated marketing manager who will help you build traffic to your website. This process involves:

  • Initial welcome call to go over your business goals and who your ideal customer is.
  • Keyword research to determine what people are typing into search engines to find you.
  • Implementing the relevant keywords and phrases into your website:
    • Page Titles and Meta Descriptions
    • H1 Tags
    • Content Density
  • Monthly backlinking to start moving your website up in the rankings.


SEO is an ongoing task, so it's important that we continue to work on your site and build trust to get you to the top of the search engines. Your dedicated marketing manager will work with you closely during the entire process and even go over monthly progress so you know where you stand. A marketing portal will be set up for you so you can also view your traffic and rankings anytime.

Take the guesswork out of growing your online business and let us help get you to where you need to be. You are the expert in your industry. Let us be the expert in helping people find you.
Chris Folmer, SEO Consultant

Posted 10 months ago
Gayle6986

Hello Chris. I have the website Work Of Art Greeley. I would like help with SEO; please give me a call to discuss. Thank you!
https://www.workofartgreeley.com/
Elyzabeth, Official Rep

Hello Gayle,

I will have someone give you a call to discuss SEO and pricing!
Ken0163

Chris, how much does this SEO cost? www.firstclassballoons.com is our website. Thanks!
Elyzabeth, Official Rep

Hello Ken,

I'm going to have someone give you a call to discuss pricing and the different options we offer.
Becky0319

OK, I would like help with SEO as well.

www.stlouisaustralianlabradoodles.co,
Elyzabeth, Official Rep

Hi Becky,

I'll have someone reach out to you about this!
Lisaprimps0203

Hello, please have someone call me today to discuss SEO.
Elyzabeth, Official Rep

Hello,

I can see you called and were directed to the proper department. Let me know if you have any other questions.
Ken0163

Please call me at 804-359-3679.
Elyzabeth, Official Rep

I've submitted a request to have someone contact you.
Dawn4226

I would like someone to get in touch with me about SEO, thanks.  triptrike@live.com 
Michelle C, Employee

Hello Dawn,

I can schedule an SEO agent to reach out and discuss options. What is a good phone number to have them contact you at?
Dawn4226

I'm sorry, got it taken care of elsewhere.  Thanks, have a blessed day.
Claude4406

Hi Homestead,

I have some questions related to search engine rankings. I just posted a question about sitemaps separately. But here are some things that jump out at me looking at the Google Search Console (this concerns all my sites here at Homestead, but specifically www.accessrite.com):
1. It's well known that Google prefers secure (HTTPS) sites in terms of rankings. Is there a way to make my site secure? I note that Homestead.com (your home page) is secure.
2. The Search Console is excluding all or most of my pages as "duplicate without user-selected canonical." I think this is because it's seeing them in both mobile and non-mobile form, and I noticed that they are being hit by the mobile bot.
3. In this case, I really want to mark the non-mobile versions of my pages as canonical, as the mobile pages exclude some important information needed for my PC-based audience.
4. I forced Google to crawl/index my pages by manually submitting them each individually, but it needs to be automatic, per the sitemap, and for some reason I think the mobile bot (Googlebot Smartphone) is tripping up on the sitemap. I may be completely off-base on this, but something isn't working well.

Thanks
Claude4406

Scratch point 1 above about HTTPS: Jake helped me out with that. It would be helpful (I think) if you edited the main article to explain the process (call support, or whatever) for users of WebsiteBuilder. Now, I suppose I'll have to resubmit all my pages to Google for indexing since the URLs will change?

That just leaves 2-3 to answer, as well as the maybe related sitemap questions posted elsewhere this morning. Thanks!
Michelle C, Employee

Hello,

2. The reason you are seeing that is that WebsiteBuilder does not have the viewport enabled. This will not have any negative impact on your SEO rankings.
3. Unfortunately, there is no way to use a canonical for one version, as the desktop and mobile versions use the same URL.
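For context, a "user-selected canonical" normally means a link tag in the page's head telling search engines which URL is the preferred version of the content. A generic sketch is below (example.com is a placeholder; as the reply above notes, this cannot separate desktop and mobile here, since both render from the same URL):

```html
<head>
  <!-- Declares the preferred (canonical) URL for this page's content -->
  <link rel="canonical" href="https://www.example.com/preferred-page">
</head>
```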
Claude4406

Kindly escalate this to the highest levels of engineering possible. Houston, we have a problem.

This is a consolidated discussion concerning Google indexing and subsequent ranking, along with another topic concerning sitemaps. I got my engineering degree many years ago and gradually moved to software development over the last few decades. I started building my own websites in the early 2000s and researched SEO heavily in the mid-2000s. Along the way, at times I've achieved stellar search engine rankings. I know a fair amount about the importance of keywords and content in SEO, but in the last 10 years, I've been busy with my software, so I admit my SEO skills aren't that current. As an aside and apology in advance, if I sound like a grumpy old man, well, I am old at 64, and looking at the things below, I'm getting very grumpy.

Recently I've noticed I can't get organic rankings no matter what I do and have resorted to paid ad campaigns to get noticed. I discovered Google Search Console recently and started using it, as it seems far superior to the old Webmaster Tools. It gives a lot of in-depth information to help understand issues with one's sites. So I spent hours last night looking at every page on every one of my sites as Google Search Console sees them. I also discovered that Bing has a better tool for identifying specific problems with the tags and content, but I had to create a CNAME alias record to verify it for them.

Let's get this straight: months ago, I verified all my sites in GSC and submitted sitemaps for all of them except #3 below, which is newer. Now here is the horrible summary of what I've found:

1. Only one page on one site seemed to be indexed per the sitemap, and that was the home page.

2. Most of the rest of the pages were indexed (because I'd manually submitted them) but listed as NOT in the sitemap.

3. And here's the absolute worst thing: some of my pages were listed as "UNKNOWN" to Google despite being in the sitemap for a long time. Unknown means the page will never show in any search, and, just as bad, the indexing process can't associate anything on the page with the content of the other pages on the site!

My conclusion is that something is wrong with the sitemap, and the graphics below seem to support my suspicions: the robots.txt files are disallowing the sitemaps, and my research indicates that's the wrong thing to do.

In the last year, I developed four sites here at Homestead, verified them with Google, and submitted sitemaps for all of them. Now that I've allowed enough time for Google to crawl them, I went back to look at them. I am very upset to find the most incredibly balled-up situation imaginable. I believe there are major issues with the way the Homestead sites are appearing to Google, and this has fouled up any possibility of decent rankings.

The websites I’m referring to are:

1. iManageInformed.com, where I sell my core product.

2. iSpecQA.com, where I sell a companion application to the core product.

3. Accessrite.com, which is a new website to sell programming and development services.

4. HVAC-Scams.com, which is an informational website about fraud in the heating and air-conditioning service industry.

I found that having a secure website (HTTPS) helps a bit with ranking. For some reason, two of the new sites (1 and 4 above) were set up as secure, but the others were not. WHY? I didn't request secure or not secure initially, so I'm curious as to why half appeared as secure and half did not. Did it have something to do with the templates I chose, the phase of the moon, or what? Yesterday I asked that they all be switched to secure, but only site 3 above was changed. No matter, I'll get the other one switched too. Realize this entails more work in Google Ads, since the landing page URLs change when switching.

But here is the real problem, and it occurs with every one of my sites in the Google Search Console: Google apparently can't use the sitemaps. The sitemaps do appear to be correct and list the individual page URLs. I'm wondering if the problem traces to the robots.txt file disallowing the sitemaps. This is an example of what I see:


I researched this; some say this was done originally to keep sitemaps from appearing in search results, but others say that problem doesn't happen anymore, and that you are, in effect, telling Google not to "crawl" or look at the sitemap. So perhaps that's the problem with what I'm seeing.
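To make the two readings concrete, here is a sketch of the robots.txt patterns in question (example.com is a placeholder domain). The first tells crawlers not to fetch the sitemap file at all; the second is the standard way to advertise it:

```text
# Suspect pattern: crawlers are told not to fetch /sitemap.xml
User-agent: *
Disallow: /sitemap.xml

# Standard pattern: nothing relevant blocked, and the Sitemap
# directive points crawlers at the sitemap's full URL
User-agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml
```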

Let's start with www.ispecqa.com. It has four pages, and all are showing severe issues:

Home Page: Indexed (because I submitted it manually) NOT SUBMITTED IN SITEMAP
http://www.ispecqa.com/the-core: Indexed, NOT SUBMITTED IN SITEMAP
http://www.ispecqa.com/pricing URL IS NOT ON GOOGLE  (WOW) | COVERAGE DUPLICATE WITHOUT USER SELECTED CANONICAL | LAST CRAWL 7/19/2019 GOOGLEBOT DESKTOP
http://www.ispecqa.com/features Indexed, NOT SUBMITTED IN SITEMAP

This is really screwed up! According to Google, none of the four pages exists in the sitemap, but they sure as hell do! That's why I'm wondering if that robots.txt disallow statement is a large part of the problem.
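For reference, a sitemap covering those four pages would look roughly like this (a hand-written sketch; the file the site builder actually generates may differ in details such as lastmod dates):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.ispecqa.com/</loc></url>
  <url><loc>http://www.ispecqa.com/the-core</loc></url>
  <url><loc>http://www.ispecqa.com/pricing</loc></url>
  <url><loc>http://www.ispecqa.com/features</loc></url>
</urlset>
```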

Here is another clue pointing to this issue. When I inspect the home page URL, here's what Google reports, which again seems to point to that disallow statement in the robots.txt file. This is the home page! Google is saying it's NOT GOOD: robots.txt is blocking it:


So, let’s get this fixed as it may be impacting other issues I found!
Michelle C, Employee

Claude, I do want to help to the best of my ability, and I am doing what I can to research a resolution for what you may be running into, but I hope you understand this is outside Homestead's scope of support. We have tested a few WebsiteBuilder sites on our end and were able to successfully get the sitemap submitted and everything indexed. We definitely want to help where we can; to start, can you please send a screenshot of the Sitemaps tab?
Claude4406

Thanks, I understand. Perhaps someone changed something at the engineering level since the initial build/testing was done. Feel free to work with me and my sites on testing solutions.
Here is my accessrite.com sitemap as it appears in Chrome:

Here is what I'm seeing in GSC (note I had to submit it again today since we switched it to HTTPS)


Michelle C, Employee

Thank you for that. If you see a URL that was not successfully indexed in GSC under Coverage, you can click on "Inspect URL" and then choose the "Request Indexing" option.
Based on the sitemap from your website and what is being pulled into your GSC, everything is functioning correctly. There are 10 URLs in the sitemap provided, and that coincides with the number of URLs Google is discovering. Once provided with the sitemap, it is up to Google to inspect and decide what they are going to index; Homestead has no control or influence over that process.
Claude4406

I disagree. Please read the post again: the pages were all submitted manually for indexing, except the few that I inadvertently missed, and those do not appear to exist in Google's search. These should have been discovered automatically per the sitemaps I submitted MONTHS ago. You need to get an engineer or two to look into this and fix it. In the meantime, can you modify the robots.txt file to not disallow the sitemap? Just as a test. This is really frustrating. In the older site builder, there was a file manager that allowed the user a lot more control over what was going on. Let me know if no action to fix this will be taken so I can make plans accordingly. This impacts my livelihood.
Michelle C, Employee

I understand, and I will get this concern escalated to our engineering team to investigate further. Please allow 24 hours for the team to look into this, and you will be informed of any updates.
Michelle C, Employee

Hello Claude,

I did want to reach out to you to provide an update on the escalated issue. The ticket has been closed, and here is a copy of the resolution.

"From Google Search Console we see the sitemap submission was successful. This means that Google did not find any errors with the sitemap.

Google documents the status messages and their meanings in their help documentation: https://support.google.com/webmasters/answer/7451001?hl=en

The robots.txt file contains two directives, User-agent: and Sitemap:. The User-agent: directive gives the option to disallow files or directories from indexing, while the Sitemap: directive defines the sitemap. The sitemap is not being blocked by the robots.txt file; we are actually telling search engines where to find the sitemap.

Google does not guarantee that they will index or crawl all pages, whether they are submitted in the sitemap or found organically: https://support.google.com/webmasters/answer/34441?hl=en

The best place to find answers to any Google Search Console questions is on their help page: https://support.google.com/webmasters#topic=9128571"

Overall, it would be best to use GSC support to get more information on how their software functions. If there are any technical issues in the future that we are able to assist with, we would be glad to help out! I hope this addresses your concerns.
Claude4406

I'm going to do a little test: I'm reconstructing my site at SquareSpace.com and will move the domain over shortly. Let's give it a month to index and see what happens.

Michelle C, Employee

Recreating the site elsewhere may do more damage than good, only because it may cause Google to get even more confused depending on the formatting of their sitemap. However, it is totally up to you.
I do want to explain why you are seeing "Indexed, not submitted in sitemap": the URL you are submitting is https://accessrite.com/, while your sitemap references the URL that contains the www. Google is still able to relate this to your home page, because both URLs reflect your index page, but technically they are two different URLs in Google's eyes.

This is the URL your sitemap provides: https://www.accessrite.com/

And the URL you are submitting is: https://accessrite.com/
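The www/non-www mismatch described above can be checked mechanically; this small Python sketch shows why the two forms count as distinct URLs to a crawler:

```python
from urllib.parse import urlparse

# Hostname from the sitemap entry vs. hostname of the manually submitted URL
sitemap_host = urlparse("https://www.accessrite.com/").netloc
submitted_host = urlparse("https://accessrite.com/").netloc

print(sitemap_host)    # www.accessrite.com
print(submitted_host)  # accessrite.com

# Different hostnames mean different URLs,
# even though both happen to serve the same home page.
print(sitemap_host == submitted_host)  # False
```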
Claude4406

I promised to follow up on this with an apples-to-apples comparison of the same site built on two platforms. So, while you may delete this post due to the controversy it may incur, at least you'll know you have some work to do, despite what your engineer says about how good your platform may be.

I spent the weekend recreating my site with a different domain and on another platform. I took the opportunity to change the design a bit, but otherwise the two sites (Homestead: accessrite.com and Wix: accessrightdevelopment.com) are nearly identical in content and SEO-related underpinnings.

I submitted the new site's sitemap on Sunday, Aug 10. GSC reporting lags about three days, so I didn't get any results until yesterday. The single day reported, Aug 11, showed 10X the number of impressions compared to the older site. Here are the two, both showing the Aug 11 results. First, the Homestead version:



Next, the newer site. Note that total impressions are fewer since it is only showing one day, the first after indexing, while the above is the cumulative total of many more days (since 6/22):