

Member Since 12 Mar 2014
Last Active Oct 27 2020 08:12 AM

Posts I've Made

In Topic: Set character encoding is not working in P4Merge

22 October 2020 - 06:21 AM

p4crenck, on 22 October 2020 - 01:12 AM, said:

Hi, are you launching P4Merge from P4V? Also, make sure you are running the latest versions of P4V and P4Merge.

Yes, I am diffing a file in P4V, and the version is Rev. P4V/NTX64/2020.2/2013107.

FYI, I have updated to Rev. P4V/NTX64/2020.2/2028073, but the problem is still there.

In Topic: Set character encoding is not working in P4Merge

20 October 2020 - 06:14 AM

p4crenck, on 19 October 2020 - 08:24 PM, said:

The encoding defined in the P4Merge Preferences is the one used when you launch it. You can then change the encoding for the current session by selecting the menu File->Character Encoding.

If you're launching P4Merge from P4V, P4V passes its current encoding to P4Merge, overriding its preferences.
The only workaround would be to define P4Merge as a 3rd party Diff/Merge tool in P4V Preferences.  Then the P4Merge encoding preference you set will be used.
Hope this helps

Hi, thanks for the reply. We have tried setting it in the P4Merge preferences, but it does not seem to work.

When I start P4Merge by diffing files, the encoding defaults to "System", even though I have already set it to "utf8 no bom" in the preferences (our system encoding is not UTF-8).
[Attached screenshot: 1234.png]

In Topic: Can I break the creating checkpoint process

20 July 2020 - 03:44 AM

Sambwise, on 18 July 2020 - 02:57 PM, said:

Yes, the size of the db files is a much better gauge for how long the checkpoint will take!

If you want to try to gauge how close the checkpoint is to completion, you can peek at the checkpoint file and see which tables have been dumped so far.  The tables will be checkpointed in the locking order (documented here: https://www.perforce...current/schema/) and the number of rows in each table will roughly correspond to their size on disk.  Typically db.have, db.rev, and db.integed are the largest ones.
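As a sketch of that "peek", assuming the standard checkpoint/journal record format in which each line is `@pv@ <version> @db.<table>@ ...` and the table name is the third whitespace-separated field (the sample records below are invented for illustration, not taken from a real server):

```shell
# Invented sample of checkpoint-style records, one row per line.
cat > sample.ckp <<'EOF'
@pv@ 7 @db.counters@ @change@ 42
@pv@ 9 @db.user@ @alice@ @alice@@host@ 0 0 @alice@
@pv@ 9 @db.user@ @bob@ @bob@@host@ 0 0 @bob@
@pv@ 7 @db.have@ @//depot/a#1@ 1
EOF

# Count rows dumped per table so far; tables appear in locking order,
# so the last table name printed shows how far the checkpoint has got.
cut -d ' ' -f 3 sample.ckp | sort | uniq -c
```

On a live checkpoint you would point the same pipeline at the in-progress checkpoint file (or `tail` it) instead of the sample created above.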

Thanks a lot, your answer is very helpful!

In Topic: Can I break the creating checkpoint process

18 July 2020 - 08:06 AM

Sambwise, on 18 July 2020 - 07:12 AM, said:

The time to create a checkpoint isn't necessarily a function of the size of the journal file; most of what's in the journal won't end up in the checkpoint (e.g. db.have entries that have since been overwritten).

Since a checkpoint is primarily a read operation, it should be safe to kill it without damaging the db, but the checkpoint it's writing will be incomplete and is not suitable for recovery, and the journal will not be rotated.  The next time you take a checkpoint it'll start over from the beginning.

Thank you! So it is safe to break the process, that's good news!
From your answer, can I say that the size of the db files affects the time spent creating a checkpoint? The db files store records from the journal, and the checkpoint reads records from the db files. Is that the workflow?

In Topic: What's the suggested P4API for developing custom tool on MacOS

18 December 2019 - 08:07 AM

Sambwise, on 17 December 2019 - 04:06 PM, said:

It depends on the tool.  If it's running a simple p4 command, you don't need any API beyond the CLI tool.  If it's something more complex but nothing graphical, I'd write a Python script using P4Python.  If it's a graphical tool where I needed to use Cocoa, I'd write it in Obj C (ugh) and use the C++ API.
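For the simple-command case, one stdlib-only pattern (no P4Python required) is to parse the marshalled Python dictionaries that `p4 -G <command>` writes to stdout, one per result record. The bytes below are simulated rather than captured from a real server, so the field values are assumptions for illustration:

```python
import io
import marshal

# Simulated stdout of `p4 -G info`: the p4 CLI emits one marshalled
# dictionary per result record (the real CLI uses marshal version 0,
# which marshal.load reads the same way).
fake_stdout = io.BytesIO(marshal.dumps({b"code": b"stat", b"userName": b"alice"}))

def read_records(stream):
    """Yield each marshalled record until the stream is exhausted."""
    while True:
        try:
            yield marshal.load(stream)
        except EOFError:
            return

for record in read_records(fake_stdout):
    print(record[b"userName"].decode())  # prints "alice"
```

Against a real server you would feed `read_records` the stdout of something like `subprocess.run(["p4", "-G", "info"], capture_output=True)`; P4Python exposes the same records through a higher-level API, which is the better fit once the tool grows beyond a few commands.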

Thanks a lot, I will develop a graphical tool following your suggestion.