Storing users' IP addresses efficiently

I'm looking for suggestions on how to design a table to store users' IP addresses. I will be saving the last 20-50 IP addresses each user has used. I don't plan to store duplicates (unless that somehow turns out to be the most efficient approach), but I do want to record how many times each IP address was used. So if I check whether an IP address already exists for a user and find that it does, I want some way to track how many times I have tried to insert that IP address for that user.

I will use a PHP script to insert the IP addresses and to check whether I am inserting a duplicate, but I am concerned about the table design and how to store the data efficiently. Here are some concerns:
Currently I have a table with three columns: column 1 for the user's id, column 2 for a single IP, and column 3 for the date it was used. Each time a user logs in, I check whether that user's current IP is already associated with them in this table. But I am not sure how to extend the table to also record, efficiently, how many times a user has used a certain IP.

I am probably overlooking something basic here. Any thoughts?

At the end of the day I want a PHP script that displays data like this:

User Name

  1. IP Address 1 (used X times) (last used)
  2. IP Address 2 (used X times)

You'll also need to take into account that there are two IP addressing systems now in use, IPv4 and IPv6, which have different textual formats. Unless you're dealing with an internal network, you'll get some users on IPv4 and some on IPv6.
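A rough sketch in PHP of telling the two apart (filter_var and inet_pton are standard PHP; the canonicalizing step at the end is optional):

$ip = $_SERVER['REMOTE_ADDR'];

// both textual forms fit in a VARCHAR(45): 45 characters covers the
// longest IPv6 form, including IPv4-mapped addresses like ::ffff:192.0.2.128
if (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_IPV4)) {
    // IPv4, e.g. 192.0.2.128
} elseif (filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_IPV6)) {
    // IPv6, e.g. 2001:db8::1
}

// normalize to a canonical string so the same IPv6 address always
// compares equal (e.g. 2001:0DB8::0001 becomes 2001:db8::1)
$canonical = inet_ntop(inet_pton($ip));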

One option:

You could have a table with

userID
ipAddress
frequency
lastDateUsed

Then when a user logs in, run a query to see if the IP address currently used by the user already exists in the table for that user. If it does, increment the frequency value for that IP address by 1 and update lastDateUsed. If the IP address does not exist, insert a new record for that user's current IP address.
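In rough SQL terms (question marks standing in for the actual values):

-- check for an existing row
SELECT frequency FROM ips WHERE userID = ? AND ipAddress = ?;

-- if a row came back:
UPDATE ips
   SET frequency = frequency + 1,
       lastDateUsed = NOW()
 WHERE userID = ? AND ipAddress = ?;

-- otherwise:
INSERT INTO ips (userID, ipAddress, frequency, lastDateUsed)
VALUES (?, ?, 1, NOW());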

running a SELECT query and then deciding whether you should use UPDATE or INSERT as a second query is inefficient (two queries)

just use INSERT ON DUPLICATE KEY UPDATE syntax (one query)
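something along these lines, using the column names from the previous post (the NOW() calls assume you want the database to stamp the date itself, and it relies on a unique key covering userID and ipAddress, more on which below):

INSERT INTO ips (userID, ipAddress, frequency, lastDateUsed)
VALUES (?, ?, 1, NOW())
ON DUPLICATE KEY UPDATE
    frequency = frequency + 1
  , lastDateUsed = NOW();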

yep, that’s a better way :slight_smile:

Thanks for the suggestions. I checked out the INSERT ... ON DUPLICATE KEY UPDATE syntax, but I am not sure I have an appropriate key here, as called for by the MySQL manual (http://dev.mysql.com/doc/refman/5.0/en/insert-on-duplicate.html)

Here’s the syntax I came up with, which is likely wrong (I haven’t used ON DUPLICATE KEY UPDATE before):

INSERT INTO ips (userid, ipaddress, lastusedtime) VALUES ('{$userID}', '{$currentIP}', '{$lastUsedTime}')
ON DUPLICATE KEY UPDATE frequency = frequency + 1;

(I set the frequency column as NOT NULL with a default value of 1.)
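For what it's worth, here is the same statement as a mysqli prepared statement, which also avoids SQL injection (this is a sketch assuming a mysqli connection in $db; my actual code interpolates the variables directly, as above):

// same query, parameterized instead of interpolated
$stmt = $db->prepare(
    "INSERT INTO ips (userid, ipaddress, lastusedtime)
     VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE frequency = frequency + 1"
);
$stmt->bind_param('sss', $userID, $currentIP, $lastUsedTime);
$stmt->execute();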

The problem is that no single column is unique: the userID appears multiple times (once for each IP address), IP addresses can repeat (people might log in from the same location), and lastusedtime values might coincide.

ALTER TABLE ips ADD UNIQUE ( userid , ipaddress )

it’s the combination that you want to be unique, and this composite key will trigger the ON DUPLICATE which is exactly what you want (“see if the ip address currently used by the user exists in the table for that user”)

I'm still having a bit of difficulty implementing what you recommended. For a reason unknown to me, instead of incrementing the frequency by 1, it increments it by 2 when I use the INSERT ... ON DUPLICATE KEY UPDATE statement above.

Here is my SHOW CREATE TABLE output, if it helps:

CREATE TABLE `ips` (
  `userid` varchar(45) NOT NULL DEFAULT '0',
  `ip` varchar(45) NOT NULL DEFAULT '0',
  `ip_date` varchar(45) DEFAULT NULL,
  `frequency` varchar(45) NOT NULL DEFAULT '1',
  PRIMARY KEY (`userid`,`ip`),
  UNIQUE KEY `userid_UNIQUE` (`userid`),
  UNIQUE KEY `ip_UNIQUE` (`ip`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8

Any thoughts on why this might be the case?
I am using the same code I posted above, modified for my new column names.

could you take a moment and explain why you have those three unique indexes (counting the PK, which is also unique)

as it stands, each userid can exist in the table only once, which kind of goes against what you said in post #1 ("suggestions on how to design a table to save users' IP addresses"), because it means each user can only ever have one ip

also, since each ip can only exist in the table once, that means that no users can be on a shared ip (like through a company firewall)

That's a big error on my part. No single column should be unique. As you said earlier (or at least as I interpreted it), the userid/ip combination should be unique. What I said in post 1 is really what I am after; I am just designing this table badly. :frowning: I thought I was implementing your "ALTER TABLE ips ADD UNIQUE ( userid , ipaddress )" but I was not.

I think I fixed it, but it may still contain errors. I am not so great at MySQL, so your comments would be appreciated.

CREATE TABLE `ips` (
  `userid` varchar(45) NOT NULL DEFAULT '0',
  `ip` varchar(45) NOT NULL DEFAULT '0',
  `lastlogintime` varchar(45) DEFAULT NULL,
  `frequency` varchar(45) NOT NULL DEFAULT '1'
) ENGINE=InnoDB DEFAULT CHARSET=utf8

I ran the “ALTER TABLE ips ADD UNIQUE ( userid , ipaddress )” statement but it added a composite index, not a composite key.

key = index

:smiley:
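for what it's worth, here's how i'd declare the whole table in one go -- the INTEGER and DATETIME datatypes are just a suggestion on my part (your varchars will work, but numbers and dates are better off in numeric and date columns):

CREATE TABLE ips
( userid        INTEGER UNSIGNED NOT NULL
, ip            VARCHAR(45)      NOT NULL
, lastlogintime DATETIME         NULL
, frequency     INTEGER UNSIGNED NOT NULL DEFAULT 1
, PRIMARY KEY ( userid , ip )
) ENGINE=InnoDB DEFAULT CHARSET=utf8;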

:slight_smile: Okay, so that is out of the way. It turns out I am still getting 2 added to frequency each time my script runs. I only run this query once, and I have tested it in isolation as well. I am at a big loss here. :frowning:

Here’s my query again for the record:

$sql = "INSERT INTO ips (userid, ip, lastlogintime) VALUES ('{$userid}', '{$ip}', '{$lastlogintime}')
ON DUPLICATE KEY UPDATE frequency = frequency + 1";

Perhaps there are two duplicate-key conflicts, and so it is updating frequency twice? That's my best guess.

you tested it outside of php?

did you remove the extra indexes?

A long-awaited update. I found my error: my error handling was executing the statement twice. Every time I ran a test, I pasted in my error handling (so that it might display the error), but it was actually causing the problem. :frowning:
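For anyone who hits the same thing, the mistake looked roughly like this (a simplified reconstruction, not my exact code):

// the query was executed once...
$result = mysqli_query($db, $sql);

// ...and then the pasted-in "error check" executed it a second time,
// so ON DUPLICATE KEY UPDATE fired twice and frequency went up by 2
if (!mysqli_query($db, $sql)) {
    die(mysqli_error($db));
}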
Thanks for your help.

thanks for the update :slight_smile:

My suggestion is to store a raw record per visit, regardless of the uniqueness of the IP, userid, or anything else. Once you have the raw data, the rest is up to you: crunch it into whatever information you need. The duplicate check should not sit in the foreground, as it may slow down your website once your records grow in size.
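In rough terms (a sketch; the table and column names here are only for illustration):

-- raw log: one row per login, no uniqueness checks at insert time
CREATE TABLE login_log (
  userid     INTEGER UNSIGNED NOT NULL,
  ip         VARCHAR(45)      NOT NULL,
  login_time DATETIME         NOT NULL
);

-- crunch the counts on demand
SELECT ip,
       COUNT(*)        AS frequency,
       MAX(login_time) AS last_used
  FROM login_log
 WHERE userid = ?
 GROUP BY ip
 ORDER BY last_used DESC;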