Search engines not indexing any of my websites

| SYSTEM INFORMATION   |           |
|----------------------|-----------|
| OS type and version  | Debian 11 |
| Webmin version       | 2.013     |

I tried on two different servers with two different companies in two different countries, but I can’t seem to figure out how to make search engines index my websites. Can someone help me? I’ve tried robots.txt, sitemaps… nothing.
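
For reference, the kind of robots.txt I mean is just the permissive default, roughly like this (example.com is a placeholder for the real domains):

```text
# Allow every crawler and point it at the sitemap (domain is a placeholder)
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```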

Not really a Webmin question. I know you used to be able to register directly with search engines. They start you at the top and then rapidly drop you if no one who searches clicks on your links.

If you go to Virtualmin/logs/awstats, you don’t see any activity from bots? Kinda surprising, really. My sites with just the default web page have hits from search engines.

No bots, nothing. I have 5 websites, one of which is quite popular, and none of them is indexed. I can’t for the life of me understand why. This has been going on for 5 months now and I’ve tried everything I can think of. Kinda wanted to ask other people about it.

Have you used Google Search Console to get into Google’s search engine?
https://search.google.com/search-console/about

Yes, that was my last hope but it doesn’t seem to be working either.

In the Search Console there is the URL Inspection tool; that will tell you if it’s in there, and if it’s not getting indexed it will give you a reason.

I’d like my websites to be indexed even without using Google Search Console, and by other search engines too…

Well, you need to go to each service and add the sites to them like you did with Google.
Google is the big one though.
Have you created a sitemap for each website? (A minimal example is sketched below.)
Search “add my url”; there’s heaps on the subject.
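
If you haven’t, a minimal sitemap just follows the sitemaps.org protocol, something like this sketch (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org protocol; URLs and dates are placeholders -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```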

I think your sites are in the engines; they may just not rank very high, and that’s more of an SEO matter.
On Google or Bing, search for content on your site but put “site:yoursite.com” before it; that will confine the search to your site only. That’s an easy way to see if it’s in the search engine.

It doesn’t appear in search results even by using “site:”

Please use Google Search Console; this is really not a Webmin issue.

You’ve neglected to say how long your sites have been up. It can take weeks before anybody even crawls your site, let alone indexes it.

The number one reason some sites are never registered is gang submissions and cross-linking the same sites. Those old strategies will keep you OFF of indexes these days.

If you’re trying to game the system that way, you’re doing more harm than good.

Moved to General Discussion.

Richard

As others have said, Google Search Console is helpful for identifying problems. Fixing them… sometimes yes, sometimes no. It depends on the problems.

Part of the overall problem is that Google’s robots increasingly seem to care about things that literally no one else does, and that often make little or no difference to human users. That’s where the Search Console comes in handy. It draws attention to “flaws” that no human in their right mind would consider to be such.

What I find more helpful, and of much more value to humans, is Page Speed Insights. I rarely fail to find something that can be improved when I run pages through that app.

That’s about all I can say without knowing what the sites in question are. And honestly, I really don’t pay much attention to SEO anymore, anyway. I write for people, not bots; so I’m probably not the best one to ask for SEO advice.

Richard

They’ve been up for about a month now, give or take.

I really don’t know what to say… It seems I’ve literally tried everything. Even PageSpeed Insights :frowning:

That’s not long enough, especially if you’ve constantly messed with it. The more you mess with it, the more you’ll be ignored.

All you should do is submit your sitemap to Google and then submit your index (home) page and wait. It’s that simple.

And if you’re cross linking, get rid of it. They’ll pass you by for that.


Please just use Google Search Console and the URL Inspection tool, and screenshot the result here.
If there are issues, fix them and then use Request Indexing, as seen in the screenshot.

That’s an example of one of the dilemmas when considering what search engine bots want versus what’s best for humans.

I had a site dealing with a certain topic that also contained a section of pages about a peripherally-related topic. Over time I found that the pages about the related topic were getting more traffic than the main site, so I decided to move them to a site of their own.

I used 301 redirects in .htaccess for the individual pages like you’re supposed to; but I also included a single link from the old site to the new site explaining that those pages had been spun off. I also included a link on the new site back to the old one for visitors who were looking for that content.
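
The redirect rules themselves were nothing fancy; plain mod_alias lines roughly along these lines (the paths and domain here are placeholders, not the real sites):

```apache
# .htaccess on the old site — permanent redirects for the pages that moved
Redirect 301 /spun-off-topic/page-one.html https://newsite.example/page-one.html
Redirect 301 /spun-off-topic/page-two.html https://newsite.example/page-two.html
```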

That, of course, was a no-no as far as Google’s bots were concerned. It took forever for the new site to be indexed; and even after it was, it languished way down in the rankings until I removed the links. They also caused rankings on the old site to drop quite a bit.

The links were good for humans. They were there as a convenience to those who landed on the “wrong” site and were looking for the content on the “right” site. But I had to remove the links because Google’s bots didn’t like them.

Nowadays I pretty much ignore search engines on sites that I personally own. Search engines all suck these days anyway. They all try to guess what I really meant to search for instead of just returning the results for the damned query as entered. They’re dumbed down for idiots to the point of uselessness.

Even Google’s “Verbatim” isn’t really verbatim anymore. It’s better than using Google without it, but it’s not as good as it used to be. It spits out a few results as if it’s annoyed that you decided to use it, and ignores thousands or millions more that you know exist out there.

When I search as average users do, I get pages upon pages of link farms and other irrelevant bullshit. I can only come close to getting relevant results when I resort to quotes, brackets, asterisks, negative keywords, and other hacks that most users probably don’t even know exist. Am I supposed to optimize for irrelevancy?

Most of my niche sites’ traffic comes from other sites and forums in those respective niches, and that’s fine with me. Actual humans seem to like my sites just fine, so I no longer worry about what Google et al. think. I don’t even bother checking. I no longer care, and I’m not going to waste my time.

Unfortunately, I don’t have that same luxury with client sites.

Richard


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.