PUBLIC MARKS with tags site & xml

2008

Use Server Cache Control to Improve Performance - Apache web server settings for optimized caching with configuration files

by camel & 3 others
Caching is the temporary storage of frequently accessed data in higher-speed media (typically SRAM or RAM) for more efficient retrieval. Web caching stores frequently used objects closer to the client through browser, proxy, or server caches. By storing "fresh" objects closer to your users, you avoid round trips to the origin server, reducing bandwidth consumption, server load, and, most importantly, latency. This article shows how to configure your Apache server for more efficient caching to save bandwidth and improve performance. Caching is not just for static sites; even dynamic sites can benefit from it. Graphics and multimedia typically change less frequently than (X)HTML files. Seldom-changing graphics such as logos, headers, and navigation images can be given longer expiration times, while resources that change more often, such as XHTML and XML files, can be given shorter ones. By designing your site with caching in mind, you can target different classes of resources and give them different expiration times with only a few lines of code.
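As a rough sketch of that idea (not taken from the article itself), a few mod_expires directives in an Apache configuration can assign class-specific lifetimes; the file patterns and durations below are assumptions, not recommendations:

    # Assumes mod_expires is loaded; durations are illustrative only.
    ExpiresActive On

    # Seldom-changing media (logos, headers, navigation graphics):
    # cache for a month from the time of access.
    <FilesMatch "\.(gif|jpe?g|png|ico)$">
        ExpiresDefault "access plus 1 month"
    </FilesMatch>

    # Frequently-changing markup and feeds: cache for an hour.
    <FilesMatch "\.(x?html|xml)$">
        ExpiresDefault "access plus 1 hour"
    </FilesMatch>

Grouping by file extension like this is one way to realize the "different classes of resources" the article describes; the same effect can be had per content type with ExpiresByType.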

2006

sitemaps.org - Home

by camel & 4 others
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site. Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data to allow crawlers that support Sitemaps to pick up all URLs in the Sitemap and learn about those URLs using the associated metadata. Using the Sitemap protocol does not guarantee that web pages are included in search engines, but provides hints for web crawlers to do a better job of crawling your site.
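To make that simplest form concrete, a minimal Sitemap carrying the protocol's optional per-URL metadata looks like the following; the URL and values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2006-11-16</lastmod>
        <changefreq>weekly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

Only <loc> is required for each URL; <lastmod>, <changefreq>, and <priority> are the optional hints crawlers may use to schedule revisits.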

So What is "Feed to JavaScript"?

by bcpbcp & 17 others
An RSS feed is a dynamically generated summary (in XML format) of information or news published on another web site, so when the published RSS changes, your web site is automatically updated too.
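As a sketch of what such a feed looks like on the wire, a skeletal RSS 2.0 document might be (all titles, links, and dates here are made up):

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example News</title>
        <link>http://www.example.com/</link>
        <description>Summary of recently published items</description>
        <item>
          <title>First headline</title>
          <link>http://www.example.com/first-headline</link>
          <pubDate>Sat, 04 Mar 2006 12:00:00 GMT</pubDate>
        </item>
      </channel>
    </rss>

A feed-to-JavaScript service fetches XML like this on your behalf and emits a script include, so the list of items stays current without any server-side code on your page.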

PUBLIC TAGS related to tag site

association, blog, france, information, le carrou, medecine, news, plateforme, projet, santé, science, thomas le carrou, web, world

Active users

tyteu
last mark : 27/11/2024 13:54

moby
last mark : 11/11/2024 07:51

camel
last mark : 26/03/2008 16:59

bcpbcp
last mark : 04/03/2006 23:18

harshadoak
last mark : 07/02/2006 10:43