
How to reduce the load while analysing big file?


Hello. I have 10 capture files of packets. One file contains at least 2 billion packets, which puts a heavy load on my system when I try to analyse it with Endpoints, Conversations, etc. It can take 5-10 minutes to load the file, and then it still lags badly when I click "Resolve Names".

I don't think my PC is the problem, because it worked fine when I was analysing dump files with around 800k packets. So, are there any methods to divide the analysis of 2 billion packets into, for example, 4 parts?

neverxxsleep
asked 2021-06-20 19:09:00 +0000


1 Answer


The Wireshark suite includes editcap, which can be used to split a capture file into multiple parts.

There are options to split with a limit on the number of packets per file (-c) or by time interval (-i, seconds per output file).

There is also an option to truncate each packet to a given length (-s), useful if you don't need the traffic above the TCP layer, for example.
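As a sketch, the options above could be combined like this (the input file name big_capture.pcapng and the split sizes are assumptions, not from the question):

```shell
# Split into chunks of 500 million packets each (-c); editcap numbers
# the output files automatically (split_00000_..., split_00001_..., etc.)
editcap -c 500000000 big_capture.pcapng split.pcapng

# Alternatively, split into one output file per hour of capture time
# (-i takes an interval in seconds)
editcap -i 3600 big_capture.pcapng hourly.pcapng

# Truncate every packet to 96 bytes (-s) to keep the headers but drop
# most of the payload, which shrinks the file considerably
editcap -s 96 big_capture.pcapng headers_only.pcapng
```

Each smaller output file can then be opened in Wireshark separately, so the Endpoints and Conversations statistics only have to process a fraction of the packets at a time.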

grahamb
answered 2021-06-20 19:55:00 +0000

