I have maintained for years that the European elites are way too far ahead of the common people in trying to form a centralized European government, “the EU”, at the expense of national sovereignty and identity.  This is especially true in the former Warsaw Pact countries, which only regained their sovereignty in the 1990s.  The populations there are very sensitive to the notion of once again losing control of their own nation states and being inundated, once again, with outsiders.  The experience of Soviet invasion and occupation makes all outsiders who want to stay suspect.

My theory is that the rise of nationalism and populism in Europe is a reaction to the EU itself.  People feel pushed into a Union they don’t like or feel comfortable with, among neighbors who are welcome to visit but not to live with, because the EU means losing identity as a nation state.  It means loss of sovereignty.  If the nation state is no longer sovereign, the vote of individuals counts for less, since their own parliaments, courts, and laws can be nullified by unelected, nameless bureaucrats in Brussels.  It goes against self-determination.  This is causing deep anxiety.  The fear of losing national identity and control of their own fate triggers the rise of nationalism and populism.

The example is clear: Brexit, at its core, is about national sovereignty, not economics.

The solution is not to strengthen the centralized powers of the EU.  That will provoke an even greater backlash.  The solution is to slow down the centralization of EU power and give people more time to get used to the idea.  This may take generations.

This is not going to go well.  Looking at Europe as an outsider and from a historical view, I still see a bunch of damn tribes all looking to maintain their little patch of soil they call home.  You mess with that at your peril.

 


I’ve built a lot of different niche web directories over the years.  Frankly there are some that I have forgotten about.  But here are some highlights.

Planet Doom – 2001 – 2005.  This was my first directory/portal.  I learned a lot operating it, and it had quite a lot of traffic.  At the time, Google was pretty poor at delivering results for niche searches, so they would often put niche directory categories in their SERPs.  In effect, they would “hand off” the search to the specialists.  As Google got better at it, they moved away from featuring niche directories in the results.  Template and logo – custom, both by Lynne Scott.

I went with Scifi/Fantasy/Horror in order to make myself a little different from most of my competition, who were doing just Scifi and Fantasy.  Doing all three was never a good fit.

This is the directory that got carpet bombed by Google after my host, Searchking.com, sued Google.

Scifimatter.com – 2003 – 2012.  Once it became clear that Google would never send traffic to Planet Doom (above) again, I took the backup database from Planet Doom, discarded the Horror listings, and created Scifimatter.  The earliest version used a free version of the Gossamer Threads script, which had a flat-file database.  Later versions (pictured) used WSNLinks, if I remember right.  At various times this directory offered webring hosting and banner exchanges for webmasters.  By about 2008 it became clear that directories had had their day.  I stubbornly hung on to this neglected site until 2012, when I finally pulled the plug.  This was my favorite.  Template – off the shelf.

Shadowdark.org – This was a catchall domain.  From it I ran lots of different perl and php scripts on subdomains to save some hosting fees.  The sites were of all sorts of genres.  Three directories stand out and are listed below.

Planet Doom II – Experimenting, I revived the Planet Doom name as a Horror, Dark Fantasy and Scifi Monster Directory.  I used the free version of Fluid Dynamics Search Engine, which worked pretty well for search but was hard to administer.  The categories are fake: if you hover over a category, you see it triggers a search engine query.  It did okay but was never super popular.  Ironically, today, using a site search engine script, something like this might make a better niche directory than a directory script.  PD II might have been ahead of its time.  Template – off the shelf.

The Ring Codex – I rushed this out to take advantage of all the hype about Peter Jackson’s forthcoming Lord of the Rings movies, which were awaited with fevered anticipation.  I used a free version of the Gossamer Threads perl script.  Codex had a lot of people using it.  I should have spent more effort on promotion and building the index.  The script had no captcha protection on the Add URL form, so it eventually attracted automated spam submissions.  I think at one point I was manually deleting 200 spam submissions a day, one at a time, with very few legit submissions.  Rather than spend money on upgrades, I closed it down.  Template – off the shelf.

Spy Fiction Guide – I was getting burned out on SF/F/H, but espionage fiction had always been a favorite genre of mine.  With the fall of the Berlin Wall and the eventual collapse of the USSR, the spy genre seemed to be dying out.  So this was sort of a labor of love on my part to keep the genre alive.  It was never wildly popular, but I had fun.

The indexes of most of these were built by me over a dial-up internet connection, which was a very time-consuming process.  Niche directories were at their best when people were still building serious websites on Geocities, Tripod, Angelfire and the other free web hosts.

If you have read this far, I thank you.  This post was an itch I had to get out of my system.

 

This was also posted to /en/linking.


Searchking.com had two divisions: 1. a search engine, and 2. remotely hosted web directories (think WordPress.com, only for directories) run by different individuals.  This article will deal with the search engine alone.  The Searchking search engine I am going to talk about is not the Searchking directory that exists today, even though the latter uses a similar concept in presentation.

This is all from memory, having used it quite a lot.  I have no insider knowledge of how it worked or was administered.

What was Searchking?

Searchking was a minor search engine born at a time when there were 7 – 8 major search engines, many dozens of minor search engines, and many hundreds of directories.  It would seem very crude by today’s standards, but back in 1997 it was on par with many of the majors and almost all the minors for search quality.  It was designed by Searchking CEO Bob Massa and coded by Sargeant Hatch.

The reality was that the SearchKing search engine was not really a search engine.  It had no crawler, other than maybe a meta tag grabber; my memory is a little cloudy on that.  It was really a flat directory.  That is, it had no hierarchy of categories.  All it searched was the Title, Description and Keywords submitted for each page.  I’m sure there was a rudimentary algo, for example giving more weight to a keyword in the title than in a description, but that was it.
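Searchking's actual ranking logic is lost to history, but a flat, no-crawler keyword search like that can be sketched in a few lines.  Everything here (the field weights, the record layout) is my own assumption, not SearchKing's real code:

```python
# A minimal sketch of a flat "search engine": it matches only against the
# submitted Title, Description, and Keywords for each page -- no crawling,
# no category hierarchy.  The field weights are hypothetical.

FIELD_WEIGHTS = {"title": 3, "keywords": 2, "description": 1}

def score(record, query_terms):
    """Sum weighted hits of each query term across the three fields."""
    total = 0
    for field, weight in FIELD_WEIGHTS.items():
        words = record.get(field, "").lower().split()
        for term in query_terms:
            total += weight * words.count(term)
    return total

def search(index, query, per_page=10, page=1):
    """Return one SERP page of the best-scoring records, like the
    10-results-per-page SERPs described above."""
    terms = query.lower().split()
    matches = [r for r in index if score(r, terms) > 0]
    matches.sort(key=lambda r: score(r, terms), reverse=True)
    start = (page - 1) * per_page
    return matches[start:start + per_page]
```

With weights like these, a page titled for the query beats a page that only mentions it in the description, which matches the title-over-description weighting the post guesses at.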

How Did it Work?

For the person searching, it appeared to be a search engine.  There was just a searchbox.  When you searched, you got a SERP with 10 results on page one and ten more on page two.  When you clicked through to a site, it was shown to you with a frame at either the top or bottom.  On the frame you could vote for the quality of the site.  You could also report spam or dismiss the frame.  The quality votes also affected ranking.  The reality was that very few searchers ever voted, which means a key feature of the algo was rarely used.  The spam report was used a bit.  Mostly, people dismissed the frame as quickly as they could.

There was some human review.  I think the admins running the search engine kept a weather eye on the submissions and did some random checks.  They did act on reports.  But remember, the Web was a wide-open, wild frontier when this was built.  Nobody knew what worked and what didn’t.  Even some of the major search engines like Infoseek mainly indexed only meta tags, and maybe a little on-page text for that one page.  Hardly a deep crawl.

Submitting your website to Searchking really meant submitting each page by hand.  Again, keep in mind there were no CMSes or blog platforms in general use yet, so most websites were hand-coded in HTML.  Plus, everyone was on dialup, which was slow.  So websites tended to be 5 – 25 static pages.  Because of the slowness of dialup internet, we all tended to keep Titles, Descriptions and Keywords short.  Search, too, was rudimentary everywhere.  Keyword searches were one or two words on all search engines.  When the Searchking search engine was built in 1997, it was built to work within the limitations of the day.

By the year 2000, Searchking was starting to show its age, but it remained viable.  One feature the Searchking search engine had was instant listing.  When you added a page, it went live instantly.  This would play a key role in 2001.

By 2001, “Mighty” Yahoo was still the king of search, while Google was rapidly gaining popularity for its deep spidering and better search results.  The other major search engines were falling behind rapidly or had disappeared.  In those days, it took Google about a month to start listing a new site that had been submitted to it.  Getting a new site to rank in Google was yet another matter: you might be listed, but you might be on page 20 of Google’s SERPs.  Also, Google was still just a web search engine, not a news search engine.  News stories may have gotten in and ranked faster, but it still took a week or two before a story would appear on Google.

9-11 and the need for Instant News and Information

Then the 9-11 terrorist attacks on the Twin Towers in New York City occurred on September 11, 2001.  Everyone was stunned.  Over on the Searchking directory hosting side, we had a forum community of directory owners.  I think we all spent most of the 11th glued to our televisions and radios, but we were also hitting all the news websites for updates.  We started sharing the URLs to news stories on the forums.  Google had been caught flat-footed; for days there was little useful or current information on it, which they acknowledged and tried to correct, but that was slow.

Over at Searchking, CEO Bob Massa told us in the forums that they were monitoring people desperately searching Searchking for any kind of information about the attacks, survivors and relief efforts.  Because Searchking could list new pages instantly, he asked us to help: to dig out the URLs of news articles, new web pages of survivors, relief news, defense news, and background news, and add them to Searchking for those looking for information.  And the SK community responded.  We were searching for any news we could find ourselves anyway; TV and radio announcers were reading out makeshift URLs of survivor lists, and we could jot those down and share them too.  This was most important in those first couple of days after the attack, but people from the UK, Canada, the US, even a person in Greenland, cranked out listings of information for about 2 weeks straight.  Eventually Google reprogrammed their crawlers and started catching up, as did the mainstream media.  All the fancy high-tech crawlers failed, but little low-tech Searchking actually delivered.

It was probably Searchking search engine’s last best moment.

Why is this Important Today?

There are lessons to be learned from this example that would help make a new directory viable today.

  1. The idea of a flat directory, without drilling down through a hierarchy of categories, fits the way people use search in 2018.  Make a directory look and act like a search engine to the searcher.
  2. Even if you have a hierarchy you can hide it on another page so you don’t scare people off.  Just present the searchbox front and center to the user.  (Personally, I would still have a hierarchy of categories as “spider food” for Google, Bing and Yandex, but just make the searchbox so prominent users won’t ignore it.)
  3. One might combine a directory with one of the many available open source web search engine scripts, mixing human-reviewed results with crawler results.
  4. Instant listing after a human review is still fast.
  5. Don’t rely on user voting for rankings. People want their information in as few clicks as possible.
  6. Allow for longer Titles, Descriptions and Keywords.  You can better capture what a site is about that way.  We kept these short because of dialup slowness, but that isn’t a problem anymore.
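Points 1, 4 and 5 together suggest a very simple pipeline: submissions sit in a queue until a human approves them, and approved entries become searchable the moment they are approved, with ranking based only on keyword matching, not user votes.  A rough sketch, where the class and method names are mine and not from any real directory script:

```python
# Sketch of a human-reviewed, instant-listing flat directory.
# All names here are illustrative, not from any existing directory script.

class FlatDirectory:
    def __init__(self):
        self.pending = []   # submissions awaiting human review
        self.live = []      # approved, instantly searchable records

    def submit(self, url, title, description, keywords):
        """Queue a submission; nothing goes live without review."""
        self.pending.append(
            {"url": url, "title": title,
             "description": description, "keywords": keywords})

    def approve(self, url):
        """Human review done: publish instantly (lesson 4)."""
        for rec in list(self.pending):
            if rec["url"] == url:
                self.pending.remove(rec)
                self.live.append(rec)

    def search(self, query):
        """Flat keyword search over live records only -- no category
        drill-down (lesson 1), no user voting in the ranking (lesson 5)."""
        terms = query.lower().split()
        def hits(rec):
            text = " ".join(
                (rec["title"], rec["description"], rec["keywords"])).lower()
            return sum(text.count(t) for t in terms)
        return sorted((r for r in self.live if hits(r) > 0),
                      key=hits, reverse=True)
```

Lesson 6 falls out for free: since the fields are plain strings with no length cap, long Titles, Descriptions and Keywords simply give the search more text to match against.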

 

Interested in the directory hosting side of the old SK?  I will have more on that in a later post.

 

This was also posted to /en/linking.

 

 


The un-celebrity president: Thirty-seven years after leaving office, Jimmy Carter shuns riches, lives modestly in his Georgia hometown

Source: The Washington Post

I was not a fan of his as President, but I respect the man.  He practices what he preaches.  In an age of phonies he’s the real deal.


During the summer of 1937, Britain was so full of spies that posters were put up in public areas warning people that “Silence is safety, Now more than ever – forget what you hear” and “Be like Dad – keep Mum.”  Although most people were alert, no one expected a group of clean-cut teenage boys to be guilty of spying.

Source: In 1937 Young Nazi “Spyclists” Traveled Around Britain for Reconnaissance
