
2014-06-06 Anyway, the complications of “big data” are beyond my description.

Though I don't like the word "big data", I have to deal with "big data".

From the viewpoint of this task, I want to define "big data" as follows.

"Data, that stop any process, Excel, for example, by the shortage of memory

"Data, whenever I try to edit the data items, (same as the above)"

In short, I really feel that "big data" should be defined as "data that leaves everybody deadlocked, unable to do anything."

I think some people wonder whether they should process the "big data" itself.

Yes, they should.

Though the operations themselves are ordinary, as follows:

changing date formats, deleting odd data values, and converting item names are all absolutely hard when the target is "big data".

"Untouchables by normal methods"

This is my definition of "big data".

-----

I have to write a data transformation program in order to change "140214" into "2014/02/14", and to delete dust data.
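
For example, a streaming filter like the following can do such a transformation without loading the whole file into memory. This is only a minimal sketch in Python; the file names, the assumption that "140214" means YYMMDD in the 2000s, and the rule that "dust data" means blank lines are all my assumptions, not facts about the real data.

    import re

    SRC = "input.txt"      # hypothetical input file name
    DST = "output.txt"     # hypothetical output file name

    # Matches a six-digit date such as "140214" (assumed to be YYMMDD).
    DATE = re.compile(r"\b(\d{2})(\d{2})(\d{2})\b")

    with open(SRC, encoding="utf-8") as src, open(DST, "w", encoding="utf-8") as dst:
        for line in src:  # read one line at a time; memory use stays flat
            line = DATE.sub(r"20\1/\2/\3", line)  # "140214" -> "2014/02/14"
            if line.strip():  # skip blank "dust" lines (my assumption)
                dst.write(line)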

In addition, such a program never works as I expect on the first try, so I have to debug it several times.

In the worst case, the program itself cannot run because of the "big data".

Anyway, the complications of “big data” are beyond my description.

-----

Fortunately, I found a special editor that can open data of more than 100,000,000 lines.

The name is “EmEditor”.

Though I don't know who made it or for what purpose, it really helped me anyway.

The problem, however, is not fully resolved.

I still have to resolve some remaining issues, for example, finding target values in data of more than 100,000,000 lines.
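
Even without a special editor, such a search can at least be done as a stream. Again a minimal sketch, with a hypothetical target value and file name:

    TARGET = "2014/02/14"   # hypothetical value to search for
    SRC = "output.txt"      # hypothetical file name

    # Print each matching line with its line number, reading the file
    # one line at a time so even 100,000,000 lines fit in memory.
    with open(SRC, encoding="utf-8") as src:
        for lineno, line in enumerate(src, start=1):
            if TARGET in line:
                print(f"{lineno}: {line.rstrip()}")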

"Breaking my heart before starting the analysis" is another definition of "big data".

-----

Now I am wondering:

what proportion of the people who use the word "big data" really understand it, and

how they could come to understand the "desperate handling difficulty" of "big data".

(To be continued)