
Sambwise's Content

There have been 242 items by Sambwise (Search limited from 29-October 19)


#25631 How to delete something that's now excluded but wasn't before

Posted by Sambwise on 19 November 2019 - 10:47 PM in Streams

It feels like the solution here is to exclude them from the child stream (in whatever path they currently exist in there); if you delete them, odds are at some point you'll do a merge or copy that will propagate the delete to somewhere that you don't want it to go.  Since the issue here is that you want these files to exist, just not in this particular stream, you should exclude them via the stream paths rather than delete the files themselves.
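In stream terms, that exclusion is just an extra Paths entry in the child stream's spec. A minimal sketch (the stream names and the excluded path are hypothetical):

```
Stream: //Streams/child
Parent: //Streams/main
Type:   development
Paths:
        share ...
        exclude generated/...
```

With that in place the files keep existing in the parent (and in history); they simply stop being part of the child's view.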

#25629 Delete personal server

Posted by Sambwise on 19 November 2019 - 04:49 PM in P4V

I think the simplest way to clean up a personal server is to just delete the entire directory where it lives (i.e. the workspace root, which also has the personal server's .p4root nested inside of it).
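As a sketch of why that works (with `my_project` standing in for your actual workspace root):

```shell
# Hypothetical layout of a personal server: the .p4root directory
# (the server's database) nests inside the workspace root itself.
mkdir -p my_project/.p4root/db   # stand-in for the personal server's db files
touch my_project/README.md       # stand-in for synced workspace files

# Deleting the root therefore removes the workspace files
# and the personal server in one go.
rm -rf my_project
```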

#25618 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 18 November 2019 - 08:10 PM in P4V


However, all three (may) use batch-files of a different (common) workspace mapping onto the same local directories during some automated building process. (autoBuild_ws)

This sounds sketchy.  The fact that things go wrong after you run this script is a good indication that something's not working right with it.  Is there any possibility that the script is actually modifying persistent configuration state (e.g. it's running "p4 set" or otherwise modifying config files) so that the shared workspace badness is "leaking" into the user's environment after they run the script?

By "same local directories" do you mean that autoBuild_ws and A_ws share a local client root (or that one is a subset of the other)?  If so, that's inherently bad even if everything else is being managed well.  The script should not be using autoBuild_ws in that case, it should just inherit the environment it's executed in so that it uses the user's own workspace (that way Perforce can actually track the workspace state correctly and it'll be a lot harder to mess things up in confusing ways).

The only way I can see this shared autoBuild_ws thing working well is if it's always using its own dedicated root directory *and* the only command you ever run with it is p4 sync -p (which is basically a "stateless" sync operation) so that there's not any workspace state that's being shared between multiple client hosts.  In addition, the script should always be passing the workspace option statelessly (i.e. it should run "p4 -c autoBuild_ws sync -p", not "p4 set P4CLIENT=autoBuild_ws;p4 sync -p") so that there's no possibility of leakage.
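To make the distinction concrete (a sketch; `autoBuild_ws` is the shared workspace name from the question, and the depot path is illustrative):

```
# OK: workspace passed per-command, nothing persisted anywhere
p4 -c autoBuild_ws sync -p //depot/project/...

# Not OK: persists P4CLIENT into the user's environment/registry,
# so the shared workspace "leaks" into everything they do afterward
p4 set P4CLIENT=autoBuild_ws
p4 sync -p //depot/project/...
```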

#25615 Single depot, multiple projects- branching

Posted by Sambwise on 18 November 2019 - 03:31 PM in P4V

First -- if you can make your dev tools use relative paths instead of absolute paths, that will make life easier in general.  I always liked having all my branches on my filesystem (within a single workspace) and just being able to hop between them by changing directories.  :)  But you can definitely have multiple branches and map them all to one spot (just not all at the same time).

If you use streams, it works almost exactly like git:

p4 switch -c my-new-stream

Presto, you have a new stream, and it's mapped to your workspace in the same place as the stream you just branched it from.  When you p4 switch back to the original stream, your workspace gets switched around so that once again that stream is in the same spot.  (If you use P4V, I think this mostly works as well, although I remember there being some bugs around switching streams with old versions of P4V; that might work better now.)

If you aren't using streams, and you want to change which branch your workspace maps, you have to manage the workspace manually by modifying the View.  The basic idea is that when you're working in branch //depot/main your View maps //depot/main to your client, like this:

//depot/main/... //your-workspace-name/...

If you want to work in a different branch but map it to the same place in the workspace, you change the View:

//depot/dev/your-name/... //your-workspace-name/...

and the next time you sync the workspace, all the files will be replaced with the ones from the new branch.
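For reference, the View lives in the client spec that `p4 client` opens for editing; a sketch, continuing the example above (the Root path is hypothetical):

```
Client: your-workspace-name
Root:   /home/you/work
View:
        //depot/dev/your-name/... //your-workspace-name/...
```

Save the spec, run `p4 sync`, and the workspace flips over to the new branch.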

#25608 partner exited unexpectedly !!..

Posted by Sambwise on 17 November 2019 - 07:24 AM in General

What P4PORT is your client using?  (p4 set P4PORT)  Is it trying to connect to this instance via a one-off rsh process or is it connecting via the port your p4d daemon is listening on?  Is there any error message in the server's log file?

#25602 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 15 November 2019 - 09:06 AM in P4V

I'm still pretty sure that you have some kind of workspace-sharing issue going on, since that's the only way I can conceive of one user's open file affecting another user's sync, but if you do have the ability to create workspaces then at least it's solvable.  :)  Did you take the step of locking the workspaces and password-protecting user accounts to make absolutely sure that they can't be shared accidentally?


Certainly the Person A, B, C of my example above had all separate workspaces at the time of trying.

If this were a situation that could ever be recreated, then I'd be very interested in seeing the output of person A's "sync" command (to see what error it returns, if any) as well as an "opened" command (to verify that there are no files open on that client).

#25599 Perforce package for Synology diskstation

Posted by Sambwise on 14 November 2019 - 09:44 PM in General

The official channel for Perforce support is support@perforce.com.  This forum is mostly for user discussion; Perforce staff post here sometimes but it doesn't have an SLA like the official support email does.

#25598 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 14 November 2019 - 09:37 PM in P4V

BJS, on 14 November 2019 - 09:03 PM, said:

Can you elaborate on what you mean with "everyone share on client spec" ? (i.e. what do I need to check to find out.)
Note that I haven't set up the system in our company, I'm just using it. So I need to relate everything you tell me here to someone else.
This, plus the fact that nobody really has experience with Perforce can very well be the source of our problems :rolleyes:

Okay, I'm pretty sure this is the source of your problems, because everyone sharing one client spec will basically break every single Perforce workflow, in roughly the way that you've described.

In P4V a client spec is called a "workspace" (technically the client spec is the thing on the server that describes a workspace; P4V just calls them both the same thing).  Everyone using Perforce should have their own user name and their own workspace.  When you look at the list of users in P4V, you should see each person's name there, and when you look at the list of workspaces, you should see all of their machine names.  If someone sets up Perforce on a new machine and picks an existing workspace instead of creating a new one, then when they run a "get latest" they won't get any files, because as far as Perforce is concerned that workspace already has them.  Perforce tries to protect you from this by binding each workspace to a "Host", but it's possible to zero that field out so that the workspace can be used from any machine, regardless of where its files actually live.  This usually leads to confusion and heartbreak.

Anyone who's in the situation of someone else using their workspace and messing them up (this includes your build machine) can fix this by doing the following:

1) Set a password on their user.  This makes sure nobody else can run commands as them.
2) Make sure the Owner of the workspace matches their user name.  If it doesn't, you are using someone else's workspace!  Make a new workspace for yourself*
3) Make sure the Host of the workspace matches their computer name.  If it doesn't, you are using another computer's workspace!  Make a new workspace for this computer*
4) Set the workspace to locked.  Now only the Owner can use that workspace.

* a new workspace will always start with the current user as the Owner and the current client computer's hostname as the Host

Each person who goes through that process should now find that everything works correctly (for them at least) because nobody else is able to mess up their workspace state.  You will be able to "get latest" without having to force a re-download of everything, you will be able to check out files and merge changes submitted by other users, et cetera.
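In CLI terms, the checklist above is roughly this (a sketch; the workspace name my-ws is illustrative, and each user runs it on their own machine):

```
p4 passwd                 # 1) set a password on your user
p4 client -o my-ws        # 2) & 3) inspect the Owner: and Host: fields
p4 client my-ws           # 4) edit the spec and change "unlocked" to
                          #    "locked" in the Options: line
```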

If you don't have a Perforce license and you're consequently not able to create a workspace for each user, then I regret to tell you that you've exceeded the limits of the free version and should either pay for a license or uninstall it and use another tool.  You'd be better off just putting your files on a network filesystem (or GDrive or Dropbox or what have you) than trying to use Perforce in a configuration where everyone's sharing a single workspace.

#25592 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 13 November 2019 - 11:37 PM in P4V

Sambwise, on 13 November 2019 - 09:52 PM, said:

In any event, whatever it is that might be messing them up has nothing to do with Person B having the file open; that's just not how anything works.  :)

Waiiiiiiit.... you aren't doing something completely bananas like having everyone share one client spec, are you?

#25591 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 13 November 2019 - 09:52 PM in P4V


When Person A is now doing a "Get Latest", Perforce is not updating the file, despite it not being locally modified or checked-out. This completely breaks the workflow for Person A.

I assume this is a typo and you're talking about Person B, since you described this as a problem particular to people that *did* check the file out.

Person A should just get the latest version of the file whenever they "get latest", regardless of what Person B is doing.  If they don't, there's something else they're doing that you're not describing -- my guess would be that they're rolling back their local copy after syncing the latest version, or something like that.  In any event, whatever it is that might be messing them up has nothing to do with Person B having the file open; that's just not how anything works.  :)

When Person B (who DID open the file, which is the right thing to do) does a "get latest," they should get a message saying "must resolve (Person C's change)".  (In P4V, I think this is represented by a little red question mark next to the file or something like that.)  When they "resolve", that's what merges C's change into B's file (which is the exact thing you're asking how to do).

Give that a try and let me know how it goes.  If you're having a problem with not syncing down other people's changes, implementing a policy whereby you never check out your files is the exact opposite of the way to fix it.  :)

#25582 Problem with stream excludes and merging half of a move/delete pair.

Posted by Sambwise on 11 November 2019 - 09:48 PM in Streams

Using virtual streams is the standard way of having a "limited" view, yeah.  Usually a virtual stream is *less* overhead than a child stream with "share" paths because you don't need to merge every time the parent has a new change; it just syncs down automatically.
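A virtual stream spec for that kind of limited view might look like this sketch (stream names and the excluded path are hypothetical):

```
Stream: //Streams/main-slim
Parent: //Streams/main
Type:   virtual
Paths:
        share ...
        exclude big_assets/...
```

Because the stream is virtual, switching to it just narrows your view of the parent; there's no separate branch to keep merged.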

#25579 Problem with stream excludes and merging half of a move/delete pair.

Posted by Sambwise on 11 November 2019 - 08:29 PM in Streams


The bit I don't understand is how are excludes in child streams supposed to work given this? Or to put it another way, how can I set up my streams so that the children don't need to have everything in their parent but moves still "work"?

It's not at all possible for moves to work (i.e. remain atomic, which is the point of having a special "move" action) if you exclude half of the move.  Generally the idea is that if you "move" a file you're moving it within a self-contained module, such that its atomicity isn't going to get broken by future branching operations or view partitioning.  This gets loosely (and indirectly) enforced by the requirement that two parts of a move are always submitted in the same changelist, and the related requirement that a changelist can only ever include files from one stream (which in the simple case represents a single module).

If the move doesn't need to be atomic, you can do a "delete"/"add" combination: rather than having a single file whose location changes, you have one file that gets deleted and another that gets added, with no dependency between them.  This leaves open the possibility that a child stream ends up with no file at all, or with two copies of the "same" file, and it also makes merging between the two variants more difficult, but it does let you operate on either file in isolation and have it behave like any other file, without regard to the other variant.
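The difference at the command line is just which operations you run (a sketch; paths are illustrative):

```
# Atomic move: produces a linked move/add + move/delete pair
p4 edit  src/old/foo.c
p4 move  src/old/foo.c src/new/foo.c

# Non-atomic alternative: a plain delete and a plain add, no linkage
p4 delete src/old/foo.c
p4 add    src/new/foo.c
```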

#25577 How to do an forced "merged - Get Latest" with files not checked-out?

Posted by Sambwise on 11 November 2019 - 03:53 PM in P4V

The way that you tell Perforce "I'm working on this file, make sure other changes get merged into it nicely instead of overwriting my work" is to check it out.  Why make life difficult by forbidding yourself from checking the file out?  :)

If you really want the file to not be checked out while retaining your local content, you can revert -k it at the end:

p4 reconcile ... # open the file(s)
p4 sync ... # get latest (scheduling merges)
p4 resolve -am ... # auto-merge
p4 revert -k ... # revert the files, keeping local content

but if you just check out the files that you're working on (i.e. the standard workflow), you don't need to do any of this and can just work normally.

It's technically possible to reverse-engineer everything that Perforce does automatically during the sync/resolve process without actually letting Perforce open/resolve the file: you could construct local copies of the merge inputs (which you can determine by looking at what you synced and what the other user submitted), run a merge algorithm over them, replace your local file with the result, and then delete the temp files.  But at that point you're implementing so much logic yourself that you might as well just put the files on a network file system instead of using a version control tool whose job is to handle all of that for you.

#25574 Problem with stream excludes and merging half of a move/delete pair.

Posted by Sambwise on 11 November 2019 - 04:47 AM in Streams

The explanation in a nutshell of why this happens is that to avoid "evil twinning" (which is what happens in your second example, and is often considered undesirable), the two halves of a move are considered to be a single file with two paths, and the newer path (i.e. the move/add path) is the canonical identifier for that file.  To implement this, a file whose head revision is "move/delete" is ignored by a merge operation, and when a "move/add" revision is included as part of a merge range the resolve operation will include a filename resolve in order to account for the move/delete.  If the move/add revision isn't part of the merge, it's assumed that the corresponding move/delete has either been accounted for already or that it will be (e.g. if you were to cherry-pick around the move/add rev, you would not get the move/delete rev independently of it).

#25565 Using a Broker to route traffic based on depot paths?

Posted by Sambwise on 06 November 2019 - 05:54 PM in Administration

Redirecting every command in such a way as to make usage seamless seems like a lot more difficulty than it can possibly be worth.  I think the strategy you want to pursue is education-based rather than trying to hide the change -- make sure users who are looking for these depots know where to find them.  This could be as simple as putting empty depots in those spots with a README inside them.  If you want to get really fancy, maybe a broker filter on specific commands like "p4 fstat //old_depot/*" that will produce an error message (with the new server info) which will display as soon as a user tries to navigate to that depot?
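A sketch of what that broker rule might look like (I'm going from memory on the p4broker.conf command-handler syntax, so treat the field names as approximate; the depot name and server address are hypothetical):

```
command: ^fstat$
{
    args    = //old_depot/*;
    action  = reject;
    message = "This depot has moved; connect to perforce-new:1666 instead.";
}
```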

Miles O, on 06 November 2019 - 05:04 PM, said:

Why do you want to split these depots out?

Actually, yes, this too.  If you're splitting them out for performance reasons but you want to hide that from the user, might something like commit/edge fit the bill?

#25548 Putting head revision into source files

Posted by Sambwise on 30 October 2019 - 03:20 PM in General

Groover, on 30 October 2019 - 01:35 PM, said:

I tried p4 changes -m1 etc and this produces output that I could parse with a custom utility, but unless I am mistaken, I can't run this command line from a custom utility because I would need to pass login credentials every 12 hours and there isn't a command line switch to specify the password. Right?

No, scripting Perforce is pretty easy.  :)

If the command is running from the same environment where the user is submitting or working on their files, it'll just inherit their credentials (i.e. any script the user runs in their environment will just run as them, the same way it'll inherit filesystem permissions etc).

If you're running this as part of some centralized automation, the standard (reasonably secure) solution is to use a dedicated Perforce user account (historically Perforce would give you a free license to use for things like this, I'm not sure if that's still the case), and give that user an unlimited login timeout so you can just login once on that machine (which is presumably more secure than the average laptop -- the automation user can also be locked down to only have read-only access if that makes sense).
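The unlimited-timeout part is done with a group spec; a sketch (the group and user names are hypothetical):

```
Group:   automation
Timeout: unlimited
Users:
        buildbot
```

Then a single `p4 login` as buildbot on the automation machine produces a ticket that never expires.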

The standard INsecure solutions (which are a little easier but also aren't recommended) are:
  • don't set a password on that user (this is the hilariously insecure option, only works if the server doesn't mandate passwords)
  • do p4 -P PASSWORD changes -m1 (this works if you're on a server that doesn't mandate that you use login tickets)
  • do echo PASSWORD|p4 login (this works on a "secure" server, but is still terrible practice from a security standpoint)

(edit: wait -- how do you even do builds in the first place if you don't have a script that has permission to sync the files from Perforce?  :blink: )

#25544 Putting head revision into source files

Posted by Sambwise on 29 October 2019 - 07:12 PM in General

Here's a cheesy way that I've done this in the past:
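(The snippet didn't survive the forum export, but from the description it was presumably a version header using RCS keyword expansion, something like this sketch; the file name is hypothetical, and the file has to be submitted with a +k filetype such as text+k for the server to expand the keyword:)

```
/* version.h -- added with: p4 add -t text+k version.h */
#define BUILD_CHANGE "$Change$"  /* expands to e.g. "$Change: 12345 $" */
```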


I'd just make sure to update that file (and submit it) before I made a new build, and then the version number would automatically get compiled into the build via the keyword expansion.

If you want to write a script to run as part of your build/deployment rather than using keyword expansion, you can just fetch the change number directly from Perforce with the "p4 changes -m1" command.
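The output of that command is a single, easily parsed line (the change number, date, user, and description here are illustrative):

```
p4 changes -m1 //depot/main/...
# Change 12345 on 2019/11/18 by builder@build-ws 'Release 2.3 prep'
```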