path: root/evaluatingfosscontributions.tex
authorRajeesh K Nambiar <rajeeshknambiar@gmail.com>2013-03-25 19:35:32 +0100
committerRajeesh K Nambiar <rajeeshknambiar@gmail.com>2013-03-25 19:35:32 +0100
commit9cbf72c960653cf2e3a8fc30d69c3c61ebc0e843 (patch)
tree17f9a301f380d99efb28aedf533cabdd20d17786 /evaluatingfosscontributions.tex
parent2ca08f1db261ee4aaf703c721ad47649a0dd3a09 (diff)
Final proof reading by Hussain K.H, English proofread by Rajeesh
Diffstat (limited to 'evaluatingfosscontributions.tex')
-rw-r--r--evaluatingfosscontributions.tex36
1 file changed, 17 insertions, 19 deletions
diff --git a/evaluatingfosscontributions.tex b/evaluatingfosscontributions.tex
index 9d6e84a..3ae0407 100644
--- a/evaluatingfosscontributions.tex
+++ b/evaluatingfosscontributions.tex
@@ -14,21 +14,20 @@ theoretical foundation for the papers. The interesting thing is, many FOSS
projects cite papers published (and some regularly publish papers) in mainstream
journals to get acknowledged for their novelty and to ensure the novelty of
the algorithms they use. Authors' notes are often used in FOSS projects to
-report on real time usage, easier adaptation etc. (many plugins in GIMP are
-developments of European university PhDs. A famous one is, a resynthesizer plugin).
+report on real-world usage, easier adaptation, etc. Many plugins in GIMP are
+developments of European university PhDs. A famous one is the Resynthesizer plugin.
However, the idea of evaluating the novelty factor solely considering the contribution
to the project requires a new metric to evaluate itself too. The normal factors
-required to assert novelty is all present in a collaborative project very
+required to assert novelty in a collaborative project are very
much in line with the normal lab, professors, conferences and peer reviews mechanism.
-The only difference and a crucial missing factor will be a published (I would
-say inaccessible too, a paper costs 5-10 USD depending on the publisher,
-conference and journal) covering paper. Peer review of the technique,
-its implementation in real time projects, required documentations as a
-part of the project documentation and various blog and log entries are
-available.
+The only difference, and a crucial missing factor, is a published paper. I would
+call it inaccessible too, since a paper costs 5-10 USD depending on the publisher,
+conference, and journal. Peer review of the technique,
+its implementation in real-world projects, and various blog and log entries are
+available as documentation.
-Peer review of the subject experts happen very well in discussions
+Peer review by subject experts happens very well in discussions
over IRC and mailing lists (most of which are archived). Different perspectives
from theoretical foundations to practical implementation issues are discussed
in a single go there (depends on the project too). But this varies from project
@@ -43,7 +42,7 @@ Another thing that we can draw parallel is the criterion used
by conferences and journals for accepting a paper and the peer review system
of the projects. These are some ideas that got into my mind, when thinking about a
systematic evaluation metric for novelty in FOSS contribution. A metric
-and a system like this will help to counter so many software patents too I guess.
+and a system like this would also help to counter many software patents, I guess.
There is a Special Interest Group of ACM for Computer Science Education (SIGCSE).
They have a special section on FOSS. I haven't seen the proceedings, so I
don't know what they discussed. But it would be good to check these
@@ -58,7 +57,7 @@ in academia of Indian Language Research though his works are very popular).
I believe whatever I wrote took it a priori that acceptance of ideas
into a project is enough for validation. My problem was how to evaluate
-the novelty factor (we know there is a novelty factor, but how to scale it).
+the novelty factor. (We know there is a novelty factor, but how do we scale it?)
Then, later on, this novelty factor itself could be used to rate the projects.
Now projects interact with academia in a weird way. We should find a middle
ground, where even academic contributions like submitting a paper to A+
@@ -81,10 +80,9 @@ But the ones who are doing the contribution and waiting for it to be counted
towards their degree or salary should be aware of it and do it.
Collaborative publishing can be used very well, and the example of Wikipedia can support
-the claim. Acceptance by user community is a validation of novelty. But how the detail
-or contribution is accepted may not always be a measure of novelty
-(some contributions, very novel, might not trigger much response, some
-trivial ones might trigger huge response). So, in order to evaluate novelty and the
+the claim. Acceptance by the user community is a validation of novelty. But how a
+contribution is received may not always be a measure of novelty.
+Some very novel contributions might not trigger much response; some
+trivial ones might trigger a huge response. So, in order to evaluate novelty and the
original contribution, there should be a mechanism which projects, in turn,
can use to count or evaluate their innovativeness or novelty factor.
This, along with must-do documentation of the contribution in a collaborative
@@ -93,11 +91,11 @@ generated out of the process.
It is not just a matter of accepting FOSS into mainstream academic research, but
more or less bringing back the idea of freedom to academia. We should prepare
-a draft framework (I don't have much of an idea on how to prepare it). Then should try
+a draft framework. (I don't have much of an idea on how to prepare it.) Then we should try
evaluating some recent contributions to some projects on the basis of this framework (we can
use SILPA as one of the sample projects). Then present this to the world as a method
to count novelty in collaborative projects without using the normal way of
-status of publication (all FOSS projects maintained by universities or research
-organizations cite their publications to show novelty).
+status of publication. All FOSS projects maintained by universities or research
+organizations cite their publications to show novelty.
\end{english}
\newpage