  1. #1 SitePoint Addict (joined Nov 2004, New Jersey, 317 posts)

    URL-safe string compression functions?

    I'm looking to use a compression algorithm to store the target URL of a redirection script.

    e.g., click.php?url=(compressed string)

    Are there any compression functions that give URL-friendly output? i.e., output without any dangerous special characters, so the value could survive the round trip through $_GET and be decompressed without corruption.

    (and preferably functions that are available in PHP 4.3.1)

  2. #2 onion2k, SitePoint Enthusiast (joined Dec 2005, UK, 83 posts)
    Use a standard compression function, and then base64_encode() the resulting string.
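    A sketch of that approach, assuming the zlib extension is available (gzcompress() exists well before PHP 4.3.1). Note that standard Base64 output contains +, / and =, which are not URL-safe, so the helpers below (the names are made up) also swap those for - and _ and strip the padding:

    ```php
    <?php
    // Sketch of the compress-then-encode idea.
    // Standard Base64 emits "+", "/" and "=", which are not URL-safe, so we
    // swap them for "-" / "_" and drop the padding, reversing this on decode.

    function url_safe_compress($url) {
        $compressed = gzcompress($url, 9);      // zlib-wrapped binary data
        $encoded = base64_encode($compressed);  // printable, not yet URL-safe
        return rtrim(strtr($encoded, '+/', '-_'), '=');
    }

    function url_safe_decompress($token) {
        $encoded = strtr($token, '-_', '+/');
        $encoded .= str_repeat('=', (4 - strlen($encoded) % 4) % 4); // re-pad
        return gzuncompress(base64_decode($encoded));
    }
    ```

    click.php would then call url_safe_decompress($_GET['url']) to recover the target.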

  3. #3 SitePoint Addict (joined Nov 2004, New Jersey, 317 posts)
    A very good idea, thanks

    Only problem is that on the long URLs I tested, the base64-encoded version of a compressed string comes out the same length as the base64-encoded version of the uncompressed string; in other words, the Base64 encoding undoes the compression (on top of the standard ~33% size increase). And on shorter URLs, Base64 on a compressed string actually gives a longer string than Base64 on the uncompressed one. So it seems better to use base64_encode() without compression.

    The compression functions I tried with base64_encode() were gzcompress() and gzdeflate(); perhaps there are other functions better suited to this task.
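    For what it's worth, the overhead is easy to see: gzcompress() wraps its output in a zlib header plus a 4-byte checksum, and Base64 then inflates everything by a third, so a short URL has little room to win. A quick comparison (the URL here is just an example):

    ```php
    <?php
    // Compare Base64 of the raw URL against Base64 of the compressed URL.
    $url = 'http://www.sitepoint.com/forums/showthread.php?t=12345';

    $plain = base64_encode($url);
    $squeezed = base64_encode(gzcompress($url, 9));

    printf("original:   %d bytes\n", strlen($url));
    printf("base64:     %d bytes\n", strlen($plain));
    printf("gz+base64:  %d bytes\n", strlen($squeezed));
    ```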

  4. #4 dreamscape, SitePoint Wizard (joined Aug 2005, 1,080 posts)
    Why not just use urlencode() or rawurlencode()? That's what they're meant for.
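    For example (the target URL is made up):

    ```php
    <?php
    // rawurlencode() percent-escapes every character that could break a query
    // string, and PHP transparently decodes $_GET values, so the target URL
    // survives the round trip intact.
    $target = 'http://www.example.com/page.php?a=1&b=2';

    $link = 'click.php?url=' . rawurlencode($target);
    // In click.php, $_GET['url'] already holds the original $target.
    echo $link;
    ```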

    I doubt you'll find any compression that can make a URL take up less space, simply because it's already so small, byte-wise. Most compression algorithms are designed to make larger files smaller; they don't do much for data that is already small (a little, but not much), since the format's own header and checksum overhead eats the savings. A URL is tiny to begin with, so it's no surprise that gzipping one can make it larger than the uncompressed original.

    If you're just after a shorter URL, you could store the targets in a database or text file and make the links look like click.php?url=1, where 1 is the ID of the stored URL.
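    A minimal sketch of that lookup, using a plain array in place of the database or text file (the array contents and function name are made up); click.php?url=1 then redirects to whatever ID 1 maps to:

    ```php
    <?php
    // Stand-in for the stored URL table; in practice this would come from a
    // database query or a parsed text file.
    $urls = array(
        1 => 'http://www.php.net/',
        2 => 'http://www.sitepoint.com/',
    );

    function resolve_link($id, $urls) {
        $id = (int) $id;  // $_GET values arrive as strings
        return isset($urls[$id]) ? $urls[$id] : null;
    }

    // click.php would then do something like:
    //   $target = resolve_link($_GET['url'], $urls);
    //   if ($target !== null) { header('Location: ' . $target); exit; }
    ```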

