sync to ...#have really expensive in 2018.2?
Started by svrjkimura, Jan 15 2019 01:10 AM
2 replies to this topic

#1
Posted 15 January 2019 - 01:10 AM

We recently upgraded p4d to 2018.2 and I'm finding that the sync to ...#have (such as the one that P4V wants to run when changing stream views) is eating 100% CPU on the server side and preventing other users from being able to do anything.
I don't recall it being this expensive before. Is this a known bug? Anyone else experiencing this?
P4D/LINUX26X86_64/2018.2/1740258 (2018/12/11)
#2
Posted 26 January 2019 - 03:11 AM
We have not seen this reported, but the 100% CPU is key. Try syncing smaller batches of files so the CPU isn't pegged at 100%; your overall rate should be faster. Also try parallel sync with autotune turned on. See the KB articles.
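For example, something along these lines (the flag and configurable names are from memory and the depot paths are placeholders, so check 'p4 help configurables' and the KB articles for your release):

    # server side: allow parallel file transfers
    p4 configure set net.parallel.max=10
    # client side: request parallel transfer threads for the sync
    p4 sync --parallel=threads=4,batch=8,batchsize=524288,min=9,minsize=589824 //depot/yourproject/...#have
    # or break the sync into smaller chunks, one subtree at a time
    p4 sync //depot/yourproject/module1/...#have
    p4 sync //depot/yourproject/module2/...#have

If your client and server versions support it, autotune (net.autotune=1) is set on the client via P4CONFIG or the environment; the KB article on parallel sync covers the details.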
#3
Posted 26 January 2019 - 04:35 AM
Being CPU-bound on a sync operation (as opposed to I/O bound) is pretty weird.
If you go to the command line and use "p4 switch" to change streams, is it similarly slow?
What if you do "p4 -Ztrack sync #have" from the command line? Any clues in that output as to what it's spending all that time doing?
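Roughly, something like this from the client root (the stream name is a placeholder):

    # is switching streams from the command line similarly slow?
    time p4 switch yourOtherStream
    # re-run the have-sync with server tracking output enabled
    p4 -Ztrack sync ...#have > sync-track.txt 2>&1

If I remember the format right, the -Ztrack output includes per-table timing and lock information, so a table with a huge lapse or lock wait would be a good clue.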
Do you have lots of embedded wildcards in your protection table AND have you modified the map.join tunables? The only part of a sync operation that would have strong potential to go CPU-bound would be in the mapping calculations (which are usually capped to keep you from crashing your server, but if you mess with the caps then bad things can happen).
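If you want to check, something like this (super access is needed for the protection table, and the tunable names are from memory, so verify them against the configurables/undoc help for your version):

    # dump the protection table and look for lines with lots of embedded wildcards
    p4 protect -o
    # see whether the map.join caps have been changed from their defaults
    p4 configure show map.joinmax1
    p4 configure show map.joinmax2

If those caps have been raised well past their defaults, that would fit a CPU-bound mapping calculation.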