Our Software Picks

Saturday, 07 October 2006 03:32

Security - DSN or DSN-less

Choosing which connection method to use depends on your hosting company.

First, let's look at the typical structure of a hosting account.
Most hosts will have a setup that looks like the following:

                     account root \
                         database \
                         logs \
                         mydomain.com \
                     (where above, mydomain.com is your public Web root - some hosts will name it differently)

You are expected to place your database in the database directory and your Web files (index.html, main.html, etc.) in the mydomain.com directory.
IIS, which comes with Windows 2003, defaults "allow parent paths" to off.
This is a security measure, since allowing parent paths can let a script access all directories if permissions are not set properly.
Some hosting companies will not change this default, which limits a DSN-less connection to the Web root directory.
If this is the case with your hosting company, then DSN is the best way to go; otherwise you must place the database within your Web root.
The safest place to put files for maximum security is in your account root!

For a DSN connection you must set up an ODBC Data Source.
If your control panel allows this you can set it up yourself; otherwise you must request it from your host's support people.
strconn = "DSN=my-dataname"
Set dbc = Server.CreateObject("ADODB.Connection")
dbc.open strconn

For a DSN-less connection you only need to set the connection string to the file location.
s_db_filename = "\database\my-database-filename"
s_db_path = Server.MapPath(s_db_filename)
strconn = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & s_db_path
Set dbc = Server.CreateObject("ADODB.Connection")
dbc.open strconn
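
Once the connection is open, either method is used the same way. Here is a minimal sketch of running a query and releasing the connection afterward (the customers table and name field are hypothetical):

```asp
<%
' Query the open connection (table and field names are hypothetical)
Set rs = dbc.Execute("SELECT name FROM customers")
Do While Not rs.EOF
    Response.Write Server.HTMLEncode(rs("name")) & "<br>"
    rs.MoveNext
Loop
' Always release the recordset and connection when done
rs.Close
Set rs = Nothing
dbc.Close
Set dbc = Nothing
%>
```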

Vincent Gabriele

Saturday, 30 September 2006 06:27

IIS Configuration for Parent Paths

Newer versions of IIS default the setting that allows referencing the parent directory to disabled.
In order to use ../ to reference the parent you must change the setting, or
have your hosting company change it if they control the server.

If Enable Parent Paths is disabled, your script will error on lines like <!--#include virtual="../includes/include_file.inc"-->

In the Internet Information Services management console, right-click your website, choose Properties, and open the Home Directory tab.

Click the Configuration button to open the Application Configuration dialog.

On the Options tab, make sure the Enable Parent Paths check box is checked!
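
If you run your own IIS 5 or 6 server, you can also change this setting from the command line with the adsutil.vbs script that ships with IIS (the AdminScripts path and the site number 1 below may differ on your machine):

```
cscript C:\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/Root/AspEnableParentPaths TRUE
```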

Vincent Gabriele

Tuesday, 06 July 2004 09:54

My web form has been Hijacked!

If this has happened to you, follow these steps to stop the hijackers dead in their tracks.
The first thing to do is determine which Web forms you have.
Spammers have been successful at getting to any form within a website.
This includes forms within popular applications.
They do this by registering in your system so they gain access to the application's Web forms.
If you have installed a mail script, chances are it is prone to abuse.
Almost all mail scripts do not check for any form of abuse.
Follow the steps outlined below to rid yourself of this problem.

Web Form email scripts
A proper Web form email script must check for suspicious code and also the place the form was submitted from.
With common scripts on the market, a spammer need only copy your form to his PC.
He can then add an automated script to flood your server with hundreds of thousands of emails.
The script we wrote checks to be sure the form was activated on your website.
It also checks for suspicious code that indicates someone is trying to pass email through your system.
Replacing your script with ours, or with one that promises to block hijacking, will solve this problem.
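
As a rough sketch, the two checks described above might look like this in classic ASP (the form field name and domain are placeholders; note that HTTP_REFERER can be blank or spoofed, so treat it as one signal among several):

```asp
<%
' Check 1: was the form submitted from our own website?
sReferer = LCase(Request.ServerVariables("HTTP_REFERER"))
If InStr(sReferer, "mydomain.com") = 0 Then
    Response.Write "Forms must be submitted from our website."
    Response.End
End If

' Check 2: is someone trying to inject extra mail headers?
sEmail = Request.Form("email")   ' hypothetical field name
If InStr(sEmail, vbCrLf) > 0 Or InStr(LCase(sEmail), "bcc:") > 0 Then
    Response.Write "Suspicious content blocked."
    Response.End
End If
%>
```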

Forms within Applications.
Many spammers will use the forms within applications such as popular shopping carts, CMS apps and many others.
Chances are the culprit used a temporary free email address to sign up and dropped it soon after.
The first thing to do is remove everyone from your list that has a bad email address.
Then contact the software company to find out when they will fix this and when a patch will be available.
If they have no plans to fix it soon, you can either disable the form or contact us to place a patch on the code.

Keywords - Title - Description - Heading - Robots

Check your WEB Page!   Use our Meta Tag Generator!

These are the most important parts of your WEB Page!!!
To get the word out there you must pay close attention to each of these.
Keep in mind that you have to come up with two different plans that will cover the different ways that search engines use this information.

There are Search Engines that will use the Meta Tags for indexing while others will only use the Page Content.

Title
This is what displays for your listing.
Usually it will be your company name.
Example - <title>Welcome to Myrental.com</title>
You should have a Title of no more than 100 characters, and it should contain at least one important keyword.

Keywords
You need not spend too much time with this tag.
None of the major Search Engines use them anymore, so I would suggest using very few.
Only smaller Search Engines use them; I would still use them, but keep it down to five to eight phrases.
You must think of anything a person might type into a search to find what you are selling.
The order is also important - place the keyword with the highest priority first!
The search engines will usually place emphasis on the first three or four words.
A good example would be:
<meta name="keywords" content="car,rental,cars,automobile,automobiles">
You can also combine keywords, such as <meta name="keywords" content="car rental, auto rental, auto, car">
Of course, if you rent trucks also, then trucks will be the second or third keyword.
I also suggest you place your domain name here as the last item in the list.
This is more important for many of the smaller engines, which do not index domain names, titles or descriptions!
It makes it easier for you to find out if your site is listed with the engine or directory!
You should have at least 5 to 8 keywords on your main page, with fewer, more targeted keywords on sub-topic pages.

Description
This is a short description of your Web page.
Just fill in a quick, short description of your page - it must be brief.
The search engines read this first to get a feel for your Web page.

This will be the paragraph that describes your company and its services.
A good sales pitch is recommended here.
<meta name="Description" content="Myrental - for the best deals on car and truck rentals.">
You should have a Description of around 200 characters and no less than 150.

Heading
This has become the replacement for Keywords and Description for many search engines.
My suggestion is to come up with a heading that combines the Meta Tag Keyword and Description content as best you can.
A good example would be:
<h1>Myrental - for the best deals on car and truck rentals.</h1>
Of course your heading may be, and should be, a bit longer to fully describe your company.
Most important is to work those keywords in there.
Do not repeat the same keyword! The top engines may consider it spam.
Once is enough - repeating the same keyword can hurt you!

Additional Important Tags
Distribution - Tells the engine if you want your listing Worldwide, local or IU (internal use)
Possible values - global, local or iu
<meta name="distribution" content="global">

Revisit-after - tells the Search Engine how often to check your site.
This is useful if you change your site often.
Replace X with number of days
<meta name="revisit-after" content="X days">

Robots
The Robots meta tag is for telling the search engines not to index a page!
There may be some pages that you don't want them to include for one reason or another.
<meta name="robots" content="index,follow">
<meta name="robots" content="noindex,nofollow">
Since not all Search Engines use this, you should also have a file in your root directory called robots.txt.
It uses the following format:

User-agent: *
Disallow: /robots.txt
Disallow: /images/
Disallow: /mydocs/
Disallow: /cart.asp

For more info on Robots check these sites: Robots Text    Google Webmaster Tips

Tuesday, 06 July 2004 22:00

Search Engine Submission Guide Facts

Do any of these questions apply to you?
I submitted to xyz Search Engine and don't see my listing
My listing shows up but none of my keywords brings up my site
My title is clipped - I can't see the full title

These are pretty common complaints by many who have just submitted to the Major Search Engines

For these and other topics we will look into what to expect from the Major Search Engines.

First, let's start with when to submit a site.
Many suggest you wait until the site is finished, rather than developing the site live on the Internet.
A pro does not develop a site in this manner.
Development should be done on your PC, or in a directory / sub-Web which is excluded in your robots.txt file.
Simple HTML can easily be created on your PC, while more complicated applications will require a local Web server.
Windows supports PWS for older Windows versions and IIS for Windows 2000 and XP, which are easy to set up and use.
As soon as your pages are uploaded to your host you should submit!

Things to double check before you submit:
Keywords, Title and Description - make sure you have these right!
Double-check them using any Meta Tag Checker!
For your Title be sure to use a keyword phrase - this will get your main keyword listed.
The SEs will use your title first, and you will have a keyword phrase searchable within double quotes right away.
For instance - Best Used Car Deals - as a title can be found by searching "best used car deals".
Page Content - Be sure to use your most descriptive keyword within your content.

The Title is very important so spend time on it and don't worry about clipping.
Some SE's will clip a title over 60 chars long - so get the main message within the first 60 chars

Now we get to what to expect from the engines.
Don't expect results overnight, even if you paid a fee to be listed.
It takes up to 3 weeks or more for your site to be listed on a free submission.
Unless you went for Pay Per Click, it can take up to 10 months for your keywords to surface!
This is an important fact - if you are constantly changing your keywords or Title, you will be prolonging the process!

Now as to the submission - follow the guidelines for each engine!
Google does not want you to submit second level pages!
What they do if you don't follow the guidelines is unknown.
My rule is simple - don't upset someone who is helping you!

Tuesday, 06 July 2004 19:54

What Search Engines are Looking For

I sent out an email to a few of the top Search Engines with a number of questions.
I wanted to clear up just what was important to Webmasters on topics like Meta Tags and the like.

The questions were sent to Google, Inktomi, Altavista and Alltheweb.
These Engines supply the Majority of all Search Engine content on the Web.

What I came up with is: keywords are out!
The reason given - keywords tag: not used due to excessive abuse by spammers.
Although they are still used by minor search engines, all the majors seem to have dropped them.
If you still want to use them, limit the number of keywords to five or so.
This will be about right for most smaller engines, which have limits on the keyword string size.

This brings us to the three Item Rule:
Title, Description and Content

It appears that the Title and Page Content are the two most important items on which they base your ranking.
The Description is also very important, but it is not really used by Google.
Pay attention to all three if you want a page that ranks well in all engines.

It is hard to get solid information as to how each Search Engine ranks a page.
It is clear to me they do not want to give out that information, to avoid spamming problems!
This is understandable, due to the fact that many would abuse this information as they have in the past.
So, how an engine ranks a page is kept secret.
It is also clear to me that Search Engines rank pages based on what they believe users want!
A Search Engine will rank pages high if it believes the content is of value to those using the engine.

This brings us back to the three main items - Title, Description and content
The Title and Description must be consistent.
A Search Engine does not care about how well a page looks!
This is due to the fact that they use a Robot to look at your page.
The Robot will look at Links and written text.

Search Engines look at page links for possible keywords, but the page Title rules in this area, with Description and content coming in second.
Also, if your page contains written text consistent with the Title and Description,
and is considered of value, then it will rank high.
There are other factors involved, such as category and the number of sites within that category, that can affect ranking.

Many are trying to drive home a point that linking is the key to success.
Linking should help but I do not believe it is a major factor.
Quality links to high traffic sites help to drive traffic to your site and do help in ranking.
Quality links are links that are on topic.
If you sell cars, then other car-related websites are considered on topic and will count as a plus.
On the other hand some links will hurt your ranking!
Link farms, Link Exchange Programs or non-related Websites are not the way to go and will count as a minus.

But the real bottom line is being able to present a site to the engines that adds to the search experience.
The goal each engine is trying to reach is being able to do this without human intervention.

I do not believe anyone at any of the Major Search Engines actually looks at your page.
They may however do so if it reaches a top 10 ranking or if they have a category or topic that is important to fill.
Ranking from my experience is based on rules in a script that each engine uses to sort out the millions of pages they store.
These rules will vary from Engine to Engine but if you follow the three items rule along with quality links you should do well.

The other Meta Tags used by the Majors include:
index/follow tags
cache tag
refresh tag

I would like to thank the AltaVista, Overture and Alltheweb Support Teams for their help on this topic.

Tuesday, 17 August 2004 18:11

Dealing with file or Site inclusion

File Inclusion

The first three methods only allow local files to be included

Include method:
You can include files in subfolders, for example: file="include\include_example.htm"
Or you can include files in the parent folder or in other subfolders of the parent, for example:
Other subfolder: file="..\include\include_example.htm"
Parent folder: file="..\include_example.htm"
IIS Server Tip for referencing parent directory

For Windows you must use an .asp or .shtml extension.
If you use .shtml, then the include file must be within a directory that has execute permission set - cgi-bin will work.

Example File=

<!-- #include file="include_example.htm"-->

Example Virtual=

<!-- #include Virtual="include/include_example.htm"-->

ASP server.execute method:
<%server.execute "include_example.htm"%>

PHP include method
include "include/include_example.htm";
include ("include/include_example.htm");

These methods allow inclusion of files located on another server!

PHP under Linux can include files from other websites if allow_url_fopen is enabled.
Windows versions of PHP prior to 4.3 did not support remote file access for include.
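
With allow_url_fopen enabled, the PHP include simply takes a full URL (the address below is a placeholder):

```php
<?php
// Requires allow_url_fopen = On in php.ini
include "http://www.example.com/include_example.htm";
?>
```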

IFrame method:
<iframe scrolling="no" border="0" frameborder="0" width="200" height="150" src="include_example.htm">
Your Browser does not support IFrames</iframe>

Object method:
<object style="border: 0.5cm groove orange" data="include_example.htm" type="text/html" height="130" width="250">
Your Browser does not support the object tag</object>

Screen Scrape with ASP .NET:
This method requires you to save the file with an ASPX extension.
<%@ Import Namespace="System.Net" %>
<%@ Import Namespace="System.Text" %>

<script language="VB" runat="server">
    Sub Page_Load(sender as Object, e as EventArgs)
        'Create a WebClient instance and download the remote page
        Dim objWebClient as New WebClient()
        Const strURL as String = "http://www.httpsend.com"
        Dim objUTF8 as New UTF8Encoding()
        Try
            lblHTMLOutput.Text = objUTF8.GetString(objWebClient.DownloadData(strURL))
        Catch webEx As WebException 'catch to avoid a .NET system error page
            If webEx.Status = WebExceptionStatus.ConnectFailure Then
                'do nothing - the remote site is unreachable
            End If
        End Try
    End Sub
</script>

<table width="100%" bgcolor="white">
    <tr>
        <td align="center">
            <h1>Screen Scrape Example of www.httpsend.com</h1>
            <asp:label id="lblHTMLOutput" runat="server" />
        </td>
    </tr>
</table>

Vincent Gabriele
