Best practice...json_encode OR serialize?

I’m trying to store array information gathered from a form. The form is dynamic, so the info can include a different number of fields with different names. As such, it seems the best way to package the info is in an array or multidimensional array and then store that array in the DB.

There seem to be two ways to do this: serialize() or json_encode(). I was wondering which one is considered best practice.

So, for example, the form could have any number of text inputs named detail, paired with any number of selects named detailKind; it also has a variable set of single fields… named samples.
After the form is submitted, I encode the info like this:

$storeMe = array('detailKind' => $_POST['detailKind'], 'detail' => $_POST['detail']);
$putIntoDB1 = serialize($storeMe);     // OR $putIntoDB1 = json_encode($storeMe);
$sampleStore = $_POST['samples'];
$putIntoDB2 = serialize($sampleStore); // OR $putIntoDB2 = json_encode($sampleStore);

I like JSON because the data remains readable in its stored form; you can even edit the JSON text without corrupting the object! BUT multidimensional arrays get stored as objects and not arrays, which means an extra checking and extraction step.

serialize() keeps everything in array form, but the result is not as pretty (or malleable) after encoding.
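
To make the comparison concrete, here is a rough sketch of what the two stored strings look like for a small made-up array (the output in the comments is what I’d expect; it may differ slightly for your data):

// The same array encoded both ways; values are invented for illustration.
$storeMe = array('detailKind' => array('size'), 'detail' => array('large'));

echo json_encode($storeMe);
// {"detailKind":["size"],"detail":["large"]}  <- readable and hand-editable

echo serialize($storeMe);
// a:2:{s:10:"detailKind";a:1:{i:0;s:4:"size";}s:6:"detail";a:1:{i:0;s:5:"large";}}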

Which method would the forum experts favour?

Thanks again, as always your insights are greatly appreciated

If you’re asking me, I’d rather use json_encode().

Thus far, I am leaning that way too.

Do you know of a way to make json_encode/decode treat a multidimensional array like array(array(), array()) as an array rather than an object? That would be perfect in my situation.

Seems to me that serializing a data structure in any way for storage in a DB is considered bad practice. It’s far better to set up your tables and relations to accommodate dynamic data. For example, details and samples each sound like a many-to-many relationship. That is, a record can be related to any number of samples, and a sample can be related to any number of records.

Well, the samples, for example, are image URLs. For the convenience of the user I coded it so that a new sample text field is created with JavaScript for each entry (instead of entering a comma-separated list). This means there could be anywhere from 0 to infinity associated URLs… so it was better to keep all of the data in one column.

I could have imploded the values into a single string, but encoding or serializing seemed safer.
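
Roughly what I mean, with made-up URLs (output shown in the comments):

// The submitted samples are just a variable-length list of URL strings.
$sampleStore = array('http://example.com/a.jpg', 'http://example.com/b,c.jpg');

echo implode(',', $sampleStore);
// http://example.com/a.jpg,http://example.com/b,c.jpg  <- a comma inside a URL breaks explode()

echo json_encode($sampleStore);
// ["http:\/\/example.com\/a.jpg","http:\/\/example.com\/b,c.jpg"]  <- delimiter-safe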

Yes, the other option would be to create a table for samples (with a linking column ID)… which is OK. But details has kinds of details, then samples, then the data itself, and then the actual data. That’s an additional three tables, so four tables to output one string seems wrong.

If this is a one-off script that will have a short lifespan, then the quick-and-dirty serializing solution may be OK. But if this is for a larger application that may need to be updated over time with new requirements, then I strongly suggest you use a normalized DB schema.
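
As a rough sketch of what a normalized layout could look like for the samples (the table and column names here are hypothetical, not something you already have, and $recordId stands for whatever ID identifies the parent record):

// Hypothetical table: samples (id, record_id, url) - one row per sample URL.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

// Store each submitted URL as its own row, linked to the parent record.
$insert = $pdo->prepare('INSERT INTO samples (record_id, url) VALUES (?, ?)');
foreach ($_POST['samples'] as $url) {
    $insert->execute(array($recordId, $url));
}

// Reading them back is a plain query - no unserializing or decoding needed.
$select = $pdo->prepare('SELECT url FROM samples WHERE record_id = ?');
$select->execute(array($recordId));
$urls = $select->fetchAll(PDO::FETCH_COLUMN);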

I’d opt for json_encode, simply because that way you will get back what you inserted when you json_decode it. With serialize however:

Be aware that if using serialize/unserialize in a server farm with both 32-bit and 64-bit servers, you can get unexpected results.

For example: if you serialize an integer with a value of 2147483648 on a 64-bit system and then unserialize it on a 32-bit system, you will get the value -2147483648 instead. This is because an integer on 32-bit cannot be above 2147483647, so it wraps.

http://php.net/manual/en/function.unserialize.php#68748
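
To illustrate what that stored value actually looks like (this sketch only shows the 64-bit side; the wrap-around described in the note happens when the string is unserialized on a 32-bit build):

// On a 64-bit build, 2147483648 still fits in a PHP integer.
$big = 2147483648;

echo serialize($big);    // i:2147483648;
echo json_encode($big);  // 2147483648

// Per the note above, unserialize('i:2147483648;') on a 32-bit build
// wraps to -2147483648, because integers there cannot exceed 2147483647.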

:eek2:

Put down the serialize and step away slowly.

Thanks guys. A follow up question.

How come when you json_encode a SINGLE one-dimensional array you get back an array, but when you encode a multidimensional array you get an object?

N/M…

I realized I had used associative arrays for the multidimensional info… doh!

There seems to be a simple solution for you: just use json_decode($json, true):

mixed json_decode ( string $json [, bool $assoc = false [, int $depth = 512 [, int $options = 0 ]]] )

[…]
assoc
When TRUE, returned objects will be converted into associative arrays.

This way you will always get back arrays instead of objects.
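
In your case that would be something like this (assuming $putIntoDB1 holds the json_encode() version from your first post):

// true as the second argument forces arrays instead of stdClass objects.
$storeMe = json_decode($putIntoDB1, true);

var_dump(is_array($storeMe));   // bool(true)
echo $storeMe['detailKind'][0]; // usable as a plain array again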

I think the point about large integer overflow is irrelevant here because the OP wants to serialize form fields, and form fields are always strings or arrays of strings. However, JSON is better for human readability.

It’s not so much about single vs multidimensional arrays but rather about numeric vs associative arrays. Because JSON is JavaScript Object Notation, it must follow the rules of JavaScript data structures. In JavaScript there is no such thing as an associative array; arrays are very simple structures of elements with numeric keys 0, 1, 2, and so on. They can be multidimensional as well. So, for example, this PHP array can be directly converted to JSON:


Array
(
    [0] => zero
    [1] => one
    [2] => Array
        (
            [0] => val0
            [1] => val1
        )

)

However, when the keys are not in a straight numerical sequence starting at 0, or they are not numeric, then they cannot be represented in a JSON array; they have to be converted to objects. For example, this PHP array:


Array
(
    [0] => zero
    [1] => one
    [5] => Array
        (
            [a] => val0
            [b] => val1
        )

)

will have to be converted to objects by json_encode:


stdClass Object
(
    [0] => zero
    [1] => one
    [5] => stdClass Object
        (
            [a] => val0
            [b] => val1
        )

)

And this is what you will get back from json_decode. However, simply use true as the second param to json_decode and you will get back arrays.
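
To see the same thing in code (using the same made-up values as in the dumps above):

// Sequential numeric keys starting at 0: encodes to a JSON array.
$sequential = array('zero', 'one', array('val0', 'val1'));
echo json_encode($sequential);
// ["zero","one",["val0","val1"]]

// Non-sequential or non-numeric keys: encodes to a JSON object.
$mixed = array(0 => 'zero', 1 => 'one', 5 => array('a' => 'val0', 'b' => 'val1'));
echo json_encode($mixed);
// {"0":"zero","1":"one","5":{"a":"val0","b":"val1"}}

// Decoding with true as the second param gives back PHP arrays either way.
print_r(json_decode(json_encode($mixed), true));
// Array ( [0] => zero [1] => one [5] => Array ( [a] => val0 [b] => val1 ) )  (output condensed)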

LJ, that was a point I overlooked (that is, that non-sequential yet numeric array keys are in essence associative). More specifically, since the form returns were numeric, ordered arrays (name, title, etc.)… when I tried to join these into one associative array (using the name of the original form variable) and then use json_encode, I got back objects.

json_decode($json, true); does the trick… I was just having one serious brain f@rt… thanks all.