What is a dead link?



What is a dead link?

(2012-04-11 10:12:54)

Tags: SEO optimization, IT

Category: website optimization

A dead link is exactly what the name suggests: a link with no utility, one that spiders cannot crawl. Such links are not merely useless to a website; they actively hurt optimization work. Imagine a spider indexing the site's content and finding that many links lead nowhere. This quietly leaves the spider with an unfriendly impression of the site, and later optimization work becomes that much harder.

What negative effects do dead links have on a website?

1. They reduce the number of indexed pages and lower rankings. Because many links are invalid, the spider often comes up empty-handed when crawling content, so the number of pages the site gets indexed inevitably drops.

2. They leak PR value. Everyone knows that a website's PR value is passed along through links. Although Google has now withdrawn from the Chinese market, PR is not without use: on some forums it is still an important metric webmasters use to judge link quality. Dead links inevitably create gaps when a site passes PR along. Take my own site, the 198 traffic exchange alliance, as an example: if a site is riddled with dead links, then even a DIV+CSS layout built to lure spiders into crawling will have little visible effect, because widespread dead links leave the site structure more exposed and harder to crawl.

3. They damage the user experience. Imagine a first-time visitor arriving at your site: if 9 out of 10 pages they open cannot be reached, what will they think of the site? Surely that it is an amateurish, sloppy piece of work! And I want to warn everyone: this lost traffic is unrecoverable.

How do you detect the dead links on a website?

1. A sitemap generator can list all of the dead links on your site at the same time as it builds the site map.

2. By the same token, Google Webmaster Tools can do this as well: in its data section you can look up every link that Google failed to crawl.

How do you deal with dead links?

1. Manually delete dead links

The most important part of treating an illness is finding its root cause; only then can you prescribe the right remedy and shorten the patient's recovery. Building a website is the same. When handling dead links, it is best to locate each dead link's position and delete it from the page by hand, which is fairly easy. But once a site has been online for a long time under a long-used domain, webmaster tools can still tell you how many dead links exist, yet deleting them is no longer easy and can be quite difficult. So when building a site, take precautions from the start and rule out every dead link: the moment one appears, delete it immediately.
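When locating dead links by hand is impractical, a small script can probe a list of URLs and flag the ones that fail. The following is a minimal sketch, not a full crawler; the URL list passed in is hypothetical, and in practice you would feed it the URLs from your sitemap:

```python
# Minimal dead-link checker sketch: send a HEAD request to each URL
# and report the ones that error out or return a 4xx/5xx status.
import urllib.request
import urllib.error

def check_links(urls, timeout=10):
    """Return a list of (url, status) pairs for links that look dead."""
    dead = []
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status >= 400:
                    dead.append((url, resp.status))
        except urllib.error.HTTPError as e:
            dead.append((url, e.code))          # e.g. 404 Not Found
        except urllib.error.URLError as e:
            dead.append((url, str(e.reason)))   # DNS failure, timeout, etc.
    return dead
```

A real checker would also extract links from each fetched page and follow them, but even this flat version is enough to turn a sitemap export into a dead-link report.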

2. Use robots.txt to block the suspect pages

Continuing the doctor analogy: sometimes we cannot find the cause of the patient's illness, just as we sometimes cannot pin down a site's dead links. Does that mean we can skip dealing with them? Certainly not. If we cannot locate the dead links themselves, we widen the scope of the treatment and use robots.txt to block the suspect pages and links. Editing it is simple: add a line of the form Disallow: followed by the dead link's URL, which tells the spider that the page is invalid and forbids it from crawling that link. Then upload the edited file to the site's root directory.
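Assuming, purely for illustration, that /old-page.html is one of the dead URLs, the robots.txt rule described above would look like this:

```
User-agent: *
Disallow: /old-page.html
```

Note that the Disallow value is matched as a URL path prefix, so one rule can also cover a whole directory of dead pages (e.g. Disallow: /old-section/).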
PS: There is still a problem here. If the site itself has just been through a redesign that produced a large number of dead links, does this method still work? In my own experience, after a redesign the content and the site's structural hierarchy inevitably change substantially, and simply blocking pages with robots.txt rules is not very effective. In that case we need to use 301 redirects to send the incoming requests straight to other pages.
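A 301 redirect of the kind mentioned in the PS can be configured, for example, in an Apache .htaccess file; both paths here are hypothetical examples standing in for a URL that died in the redesign and its replacement:

```
# Permanently redirect a dead URL left over from a site redesign
# to its replacement page (hypothetical paths).
Redirect 301 /old-page.html /new-page.html
```

Because the redirect is permanent (301), search engines transfer the old page's weight to the new address instead of treating the old URL as a dead end.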

3. 404 error page reminders

If your site has been online for a long time and its indexed pages number in the hundreds of thousands, the 301-redirect approach becomes rather time-consuming. If you want to lighten the load, a 404 error page may be a good choice: we can use it to guide users to the page we want them to visit. A delay of about 8 seconds before redirecting works best, with the destination address shown as a prompt on the page; better still, let users click through themselves, which reduces the annoyance an automatic jump can cause. But a 404 error page is only a notice shown when a page cannot be reached; personally I still recommend 301 redirects, which convert the user's request directly.
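A custom 404 page following the advice above (a clickable link first, with an automatic fallback after roughly 8 seconds) might look like this sketch; the /index.html destination is a hypothetical example:

```html
<!-- Hypothetical custom 404 page: tells the visitor the page is gone,
     offers a link they can click themselves, and falls back to an
     automatic redirect after about 8 seconds. -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta http-equiv="refresh" content="8; url=/index.html">
  <title>404 - Page Not Found</title>
</head>
<body>
  <p>Sorry, this page no longer exists.</p>
  <p><a href="/index.html">Return to the home page</a>
     (you will be redirected automatically in about 8 seconds).</p>
</body>
</html>
```

The server must still be configured to serve this page with a real 404 status (e.g. ErrorDocument 404 /404.html in Apache), otherwise spiders will treat the error page itself as valid content.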




Source: https://www.dywdw.cn/c4a28d23f242336c1eb95ede.html
