Re: A programming wiki?
- From: //\\\\o//\\\\annabee <For.Reasons@xxxxxxxxxxx>
- Date: Wed, 27 Dec 2006 10:42:03 +0100
On Wed, 27 Dec 2006 08:02:48 +0100, Dragontamer <prtiglao@xxxxxxxxx> wrote:
GRRR.. google groups... *mumble*
:) Case in point? (technology-wise?)
> My point is this: unless this programming wiki will do better than
> say, CVS, I don't see any point in making it.
I do not know CVS. But I'll tell you this:
if CVS is more work, then:
1. enter url, press enter
2. click click click, project
3. click click - Function SetRect
4. rewrite this routine
5. press compile.
If it takes more than 10-20 seconds to produce the new exe file under CVS, then it is useless (to me). I don't need any "versioning" control myself. When I code, I already know that the last version is fading out, so better delete it. If not, and I am just testing something, I just comment out the code that is there. All of this takes no time. Writing code usually takes no time; what takes time is to clearly understand the solution, and then writing it is almost an insignificant after-job. Not always true, of course, it depends on the problem. But for improvements and bugfixes, it's certainly true in many cases.
What I mean is that user1 creates project1, using version 1 of all routines.
User2 changes routine xx, so it becomes version 1.1. Now user1 still uses version 1 of everything, but user2 uses version 1 of everything and version 1.1 of this specific routine. User1 can upgrade to version 1.1 if he wants to, by looking at the source trees.
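The scheme above can be sketched in a few lines. A minimal illustration only (Python, with all names invented): each routine has its own independent version history, and each project pins the exact version of every routine it uses, so two users never step on each other.

```python
# Hypothetical sketch of per-routine versioning: routines are versioned
# independently, and a project pins one version of each routine it uses.

class RoutineStore:
    def __init__(self):
        self.versions = {}  # routine name -> list of sources; index = version - 1

    def publish(self, name, source):
        """Store a new version of a routine; return its version number."""
        revs = self.versions.setdefault(name, [])
        revs.append(source)
        return len(revs)  # versions are numbered 1, 2, 3, ...

    def get(self, name, version):
        return self.versions[name][version - 1]

class Project:
    def __init__(self, store):
        self.store = store
        self.pins = {}  # routine name -> pinned version number

    def use(self, name, version):
        self.pins[name] = version

    def upgrade(self, name):
        """Opt in to the newest version of one routine."""
        self.pins[name] = len(self.store.versions[name])

    def sources(self):
        """The exact source snapshot this project builds from."""
        return {n: self.store.get(n, v) for n, v in self.pins.items()}

store = RoutineStore()
store.publish("SetRect", "; version 1 of SetRect")
store.publish("SetRect", "; version 1.1 rewrite")

user1 = Project(store); user1.use("SetRect", 1)   # keeps version 1 untouched
user2 = Project(store); user2.use("SetRect", 2)   # works on the rewrite
```

Nothing user2 publishes ever disturbs user1's build; user1 only moves forward when he calls `upgrade` himself.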
> Already, there are tons
> of tools available that are standard in any modern developer team.
> Add in some $$ from a company, and you can have standard
> state-of-the-art technology (whatever that is for Version Control...
> probably not CVS :-p)
State of the art, in programming tools, is what RosAsmers call laughing "stock". :) Not to be "uber" arrogant or anything. It's just one of the strange facts of life.
> Anyway, some issues that need to be resolved are:
> -- Edit conflicts. How to "merge" code together when 2 people
> work on the same file? The one on Wikipedia is not nearly robust
> enough to resolve Edit Conflicts.
No conflicts need to exist. Each user uses a unique set of routines/files. If user2 is changing routine 103, and causes 102 and 133 to change as well, all those change without ever causing trouble for user1, who uses all the prior versions/data of those routines. By inspecting the changes others made, user1 can upgrade his routines if he wants to. Or both upgrade and modify in the same cycle.
> -- Forking. How is it done? How easily will it happen? How will
> you organize it so you can fork experimental code out and
> merge it back in later (after testing)?
Forking? You mean merging in new changes? Maybe there could be some temporary working versions, and then the creation of lasting new versions is done by setting a checkmark at the end of the day or week. At that time, all code that has changed is stored in new files, with a different version number. Maybe voting could be involved. The system must take it on itself to fully automate this. User2, who changed the code, may or may not add comments telling others what he did; if he would like them to notice it, he may. Or the other users could just compile the new version and see for themselves whether they want the added stuff or not.
Unused or unmodified versions are moved to a relevant folder automatically, or marked as such, so they won't clutter up and complicate things for the active projects that have popularity and usefulness, for instance.
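The end-of-day promotion idea might look roughly like this. A sketch only, with the voting threshold as an invented assumption: edits accumulate as temporary working copies, and a periodic pass turns the ones with enough approval into numbered, lasting versions.

```python
# Hypothetical sketch of the "checkmark at the end of the day" idea:
# working copies are temporary; a periodic pass promotes the approved
# ones into lasting versions and drops the rest.

working = {}   # routine name -> latest working source
lasting = {}   # routine name -> list of promoted (lasting) versions
votes = {}     # routine name -> approval votes for the working copy

def edit(name, source):
    working[name] = source
    votes[name] = 0

def vote(name):
    votes[name] += 1

def promote_all(min_votes=1):
    """End-of-day pass: every working copy with enough votes becomes
    a new lasting version; working copies are cleared either way."""
    for name, source in list(working.items()):
        if votes[name] >= min_votes:
            lasting.setdefault(name, []).append(source)
        del working[name]
        del votes[name]
```

The `min_votes` knob is where the "maybe voting could be involved" part plugs in; set it to zero and promotion is fully automatic.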
For example: I make a program that scans some dir on your hard drive, and passes each buffer to an encryption routine. It can do all the work, so that the only thing you need to add is your own personal encryption routine. All the user-friendly interface stuff, which takes up most of the work and means less for the result, is done for you. You may have expertise in encryption, and you add in the missing important piece. This will reduce your job to a few hours' work, while another programmer who knows how to make efficient user interfaces, which are lots of work but not always technically challenging, can work on that, without worrying about the parts he hasn't any expertise/interest in.
> -- Branches. Closely related to the above problem. How do
> you handle the forks (aka, branches) later?
Each change forms a new independent version. This is mostly just text files, and it takes almost no significant space. For 100000 modifications this is a problem, but there could be ways to deal with that.
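One possible way to deal with the storage of very many whole-file versions (my assumption, not stated in the post) is content-addressed storage: each version is filed under the hash of its contents, so identical texts are stored only once no matter how many versions point at them.

```python
# Sketch of content-addressed storage for routine versions: blobs are
# keyed by the SHA-1 of their text, so duplicates cost nothing extra.

import hashlib

blobs = {}      # sha1 hex digest -> file text (stored at most once)
history = {}    # routine name -> list of digests, one per version

def commit(name, text):
    digest = hashlib.sha1(text.encode()).hexdigest()
    blobs.setdefault(digest, text)            # no-op if already stored
    history.setdefault(name, []).append(digest)
    return digest

def checkout(name, version):
    """Fetch the text of a given version (versions numbered from 1)."""
    return blobs[history[name][version - 1]]
```

Re-committing an unchanged file then adds only a digest to the history list, not another copy of the text.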
> Also of course, avoiding the sure problem of 20 people compiling
> all the same code on the same server instead of on their home
> computers... mentioned in a different message.
Compilation will happen locally. You just download the source files of the version you adopted, in your browser, and the browser plugin compiles them for you. You don't have to send anything back to the server until you choose to, of course.
> To be fair however; there would be some advantages...
> * A contributor can read code, change it, and leave.
Yes. If he likes to.
> Of course, the corresponding disadvantage is that...
> * A malicious user can read code, change it, and leave.
Code is visible to all, so you read it. If it looks OK, and seems to perform better, or adds features that you want, you upgrade. If you don't understand it, you don't risk losing any data, as it's all stored online anyway. And there's no forced upgrade.
The server will keep track of all the routines that changed in the last modification.
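Tracking which routines changed in the last modification could be as simple as comparing the project's version map before and after. A small sketch, with invented routine names:

```python
# Sketch: the server diffs the per-routine version map to report
# exactly which routines a modification touched (including new ones).

def changed_routines(before, after):
    """before/after map routine name -> version number."""
    return sorted(
        name for name in set(before) | set(after)
        if before.get(name) != after.get(name)
    )

before = {"SetRect": 1, "DrawText": 1}
after  = {"SetRect": 2, "DrawText": 1, "ClipRect": 1}
print(changed_routines(before, after))  # ['ClipRect', 'SetRect']
```

That list is what a user would inspect before deciding whether to upgrade his own pins.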
> So there is some give-and-take here. Also, the fact
> that a contributor needs to read/learn a LOT before he can
> really contribute well removes a key advantage of a wiki.
> Contributions from the "common man".
I guess in this case we call it the "common" programmer. There's no reason why there can not be lightweight example projects in there for beginners, with tutorials and all, for each language. Also, new versions by better programmers to learn from. And by the way, most of an application is "commonplace" anyway; only the key parts are usually advanced stuff. E.g. for the encryption program mentioned above, creating the user interface is what takes time, while the encryption routine is what is important. But without an easy-to-use interface, the encryption programmer is likely to be among the few guys to benefit from it.
The only point of such a thing would be to get rid of any download, install, upgrade cycle, and make your project accessible from any machine having a browser. Plus the ability to work in your fields of interest, and to contribute where you want to. It must be a streamlined, non-spamming, efficient and intelligent service. A masterpiece of a work, enabling cooperation across the net, that doesn't waste your time. If it could be made more efficient than having all these different tools, which I believe it could... then I can't see how it could become a miss in the GPL community, at least.
I can see of course that creating such a thing would be a terrible challenge. But I think that if it is at all possible, and if it succeeds at the user-friendliness and the efficiency of use that imo _must_ be there, then it could become an unparalleled success. I can also see some problems with it, like 2 million monkeys making modifications without any basis... But this could maybe be solved by approving project maintainers, by intermediate versions, voting, testing, etc. It is perfectly possible to approve people anonymously. You may upgrade rights by way of contributions. If user1002 has made no useful contribution, he is restricted to working on the less significant projects, and when his contribution rate has climbed over some amount, he gets a status, so that he can contribute where he wants to. If he keeps posting 200 modifications a day, then take care of him... some kind of peer evaluation or something. Another point is that such a system would centralize many useful applications, and be a resource for seeing what's out there, and many other things. Just like SourceForge is today, although _much_ faster, better, more intelligent.
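The contribution-based rights idea could be sketched like this. All thresholds below are invented for illustration; only the "200 modifications a day" figure comes from the text above.

```python
# Sketch of anonymous, contribution-based rights: users earn access to
# the significant projects by accumulating accepted contributions, and
# a daily cap flags possible flooders for peer evaluation.

PROMOTE_AT = 10    # accepted contributions needed for full access (invented)
DAILY_CAP = 200    # "200 modifications a day" from the post

class User:
    def __init__(self):
        self.accepted = 0   # lifetime accepted contributions
        self.today = 0      # modifications posted today

    def record_contribution(self, accepted=True):
        self.today += 1
        if accepted:
            self.accepted += 1

    def can_edit(self, project_is_significant):
        if self.today > DAILY_CAP:
            return False    # hand over to peer evaluation
        if project_is_significant:
            return self.accepted >= PROMOTE_AT
        return True         # less significant projects are open to all
```

A fresh anonymous user can thus start on the lightweight example projects immediately, and graduates automatically once his track record is there.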
We all do all of this already. We read forums, here and there, about what other programmers do. They post examples, they post links, we download zip files and test out their code. We learn a little here, a little there. We all do this now. In addition, we download our compilers of choice, upgrade them, create a project, and then upload it to our web space and post links on forums to make it public. I just think it could be more efficient to integrate it all into some multitool like this.
Several serious problems exist. How to convince the compiler authors to contribute? Without compilers, you cannot compile. How to make it efficient without drowning in useless contributions? Lots of problems to solve. But if they could be solved, I am pretty sure it could have something in it.