I am looking for a source of huge data sets to test some graph algorithm implementations. The files should be in an easy-to-read file format, something like:
$Node1
Node23
Node322334
Node43432
$Node2:
Node232
...
Thanks,
Chris
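For reference, a minimal sketch of how a file in this format could be read back into an adjacency list; the "$" node-header convention and the optional trailing colon are taken from the example above, and the function name is just a placeholder:
def parse_graph(path):
    # Map each "$"-prefixed node header to the list of neighbor lines under it.
    graph = {}
    current = None
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            if line.startswith("$"):
                current = line[1:].rstrip(":")   # "$Node2:" -> "Node2"
                graph[current] = []
            elif current is not None:
                graph[current].append(line)
    return graph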
A quick Python hack:
from random import randint

def generateGraph(n=100, avgNeighbors=10):
    # Emit each node as "$<id>", followed by a random number of neighbor ids.
    for i in range(n):
        print("$" + str(i))
        for m in range(avgNeighbors - randint(-avgNeighbors // 2, avgNeighbors // 2)):
            print(randint(0, n))
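To turn this into an actual dataset file, run it and redirect stdout, e.g. `python generate.py > graph.txt` (the script name is just an example); the output can then be read back with a parser like the one sketched above.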
I found these, which may or may not contain what you need:
http://people.sc.fsu.edu/~jburkardt/datasets/graffiti/graffiti.html
http://people.sc.fsu.edu/~jburkardt/datasets/sgb/sgb.html
If you repost your question at https://math.stackexchange.com/ or at https://cstheory.stackexchange.com/, you may attract the attention of algorithmic graph theorists or computer scientists specialising in graph algorithms.
Do post a link here if you repost your question, as I'm slightly interested in where to obtain such datasets. Thanks.
Have you considered using Facebook's Graph API? It provides data in a JSON format, so it is very easy to read and should provide some large graphs depending on which data you query for.
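A rough sketch of what that could look like, assuming you already have an access token and query a user's friends edge; the endpoint path, the use of the `requests` library, and the field names here are assumptions for illustration, not a tested client:
import requests

GRAPH = "https://graph.facebook.com"  # base URL of the Graph API

def fetch_friend_edges(user_id, access_token):
    # Return (user, friend) edges for one user, following JSON pagination.
    url = "{}/{}/friends".format(GRAPH, user_id)
    params = {"access_token": access_token}
    edges = []
    while url:
        resp = requests.get(url, params=params).json()
        for friend in resp.get("data", []):
            edges.append((user_id, friend["id"]))
        # Subsequent page URLs already carry the token, so clear the params.
        url = resp.get("paging", {}).get("next")
        params = None
    return edges
Note that the API only returns data you are authorized to see, so the size of the graph you can build depends on your token's permissions.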
IMDb's dataset can be used for free (non-commercially!) and can be downloaded as flat text files. It's huge: hundreds of megabytes of raw text you can build a graph from.
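As a sketch of what building a graph from that text might look like: assume you have already massaged one of the lists into simple tab-separated `actor<TAB>title` lines (the real IMDb list files need more parsing than this, and the file path here is hypothetical); a bipartite actor-movie graph is then just:
from collections import defaultdict

def build_actor_movie_graph(path):
    # Bipartite adjacency: actors -> movies and movies -> actors.
    actor_to_movies = defaultdict(set)
    movie_to_actors = defaultdict(set)
    with open(path, encoding="latin-1") as f:   # the old IMDb dumps are typically not UTF-8
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 2:
                continue                        # skip headers / malformed lines
            actor, movie = parts
            actor_to_movies[actor].add(movie)
            movie_to_actors[movie].add(actor)
    return actor_to_movies, movie_to_actors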