path: root/evaluatingfosscontributions.tex
author    kra3 <kra3@kra3.in>    2012-07-01 20:12:57 +0530
committer kra3 <kra3@kra3.in>    2012-07-01 20:12:57 +0530
commit    7749530d926180931d3eb415f6f2eb6f06607280 (patch)
tree      f8916c3d489593e14697e59162e55a5f9de7162e /evaluatingfosscontributions.tex
parent    5299386e023f2d501dff6fd2efb19569c2cb8e53 (diff)
added post evaluating foss contributions
Diffstat (limited to 'evaluatingfosscontributions.tex')
-rw-r--r--  evaluatingfosscontributions.tex | 91
1 files changed, 91 insertions, 0 deletions
diff --git a/evaluatingfosscontributions.tex b/evaluatingfosscontributions.tex
new file mode 100644
index 0000000..cc9496f
--- /dev/null
+++ b/evaluatingfosscontributions.tex
@@ -0,0 +1,91 @@
+\section*{Evaluating FOSS Contributions}
+\vskip 2pt
+
+Counting FOSS contributions towards research grants actually throws open a
+new area of investigation, I guess: evaluating the novelty of a contribution
+to a FOSS project. As we know, in any field of research, evaluation metrics
+are a big area of investigation, and people come up with new distances and
+measures every now and then (we are in the middle of such an effort for OCR
+ourselves). Normally novelty is judged by where the related paper is
+published, how well the idea is explored, and how sound the theoretical
+foundation of the paper is. The interesting thing is that many FOSS projects
+cite papers published in mainstream journals (and some regularly publish
+papers themselves) to be acknowledged for their novelty and to establish the
+novelty of the algorithms they use. Authors, in turn, often note use in FOSS
+projects to report real-world usage, easier adaptation, etc. (many GIMP
+plugins are developments of European university PhDs; a famous one is the
+Resynthesizer plugin).
+But the idea of evaluating novelty solely from the contribution to the
+project requires a new metric of its own. The usual factors required to
+assert novelty are all present in a collaborative project, very much in line
+with the familiar mechanism of labs, professors, conferences and peer review.
+The only difference, and the crucial missing piece, is a published covering
+paper (an inaccessible one too, I would say: a paper costs 5--10 USD
+depending on the publisher, conference and journal). Peer review of the
+technique, its implementation in real projects, and the required
+documentation are all available, in the form of project documentation and
+various blog and log entries. Peer review by subject experts happens very
+well in discussions over IRC and mailing lists (most of which are archived).
+Different perspectives, from theoretical foundations to practical
+implementation issues, get discussed there in a single go (this depends on
+the project too).
+But this varies from project to project. A contribution that generates a
+bigger discussion and is criticized and evaluated rigorously should get more
+points (very similar to the classification of conferences and journals into
+A+, A, etc.). We could even classify FOSS projects by how much discussion and
+scrutiny, and how many perspectives, a new feature goes through before it is
+incorporated into the existing system. Another parallel we can draw is
+between the criteria conferences and journals use for accepting a paper and
+the peer review systems of the projects. These are some ideas that came to
+mind when I thought about a systematic evaluation metric for novelty in FOSS
+contributions. A metric and system like this would help counter a good
+number of software patents too, I guess.
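+To make this concrete, here is one rough sketch of how such a score might be
+computed; the factors and weights are purely hypothetical, not something any
+project or agency uses today. For a contribution $c$, let $d_c$ be the volume
+of archived discussion it generated (mailing list and IRC messages), $r_c$
+the number of distinct reviewers who commented, and $e_c$ the number of
+recognized subject experts among them. A weighted score could then be
+\[
+  N(c) = w_1 \log(1 + d_c) + w_2\, r_c + w_3\, e_c ,
+\]
+where the weights $w_1, w_2, w_3$ are calibrated per project class (A+, A,
+\ldots). The logarithm damps the effect of a single very long thread, since,
+as noted later, sheer volume of response is not by itself a measure of
+novelty.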
+There is an ACM Special Interest Group for Computer Science Education, and
+it has a special section on FOSS. I haven't seen the proceedings, so I don't
+know what all they have discussed, but it would be good to check these formal
+forums and their proceedings for prior ideas on the subject. I don't have
+access to ACM libraries here. If we can put some time and thought into this,
+we can develop a draft and then maybe start an open discussion too. This
+would help FOSS projects avoid depending on non-free published items to
+claim the novelty that is due to them (since Santhosh is not interested in
+publishing, he is not recognized by anyone in Indian language research
+academia, though his works are very popular).
+
+
+I think whatever I wrote above takes it a priori that acceptance of an idea
+into a project is enough for validation. My problem was how to evaluate the
+novelty factor (we know there is a novelty factor, but how do we scale it?),
+and then, later on, how to turn this novelty factor itself into a rating for
+projects. Projects currently interact with academia in a weird way; we
+should find a middle ground where an academic contribution such as
+submitting a paper to an A+ journal counts the same as adding the same
+algorithm, in all its detail, to a project with an A+ novelty rating. People
+might not accept it as such, and at first there will be double
+contributions, but with enough campaigning, and by ensuring that the
+evaluation framework is strong and therefore reliable, we can make some
+progress. It would also work as a countermeasure to the now-monopolistic
+attitude of IEEE, ACM, etc. in academic publishing.
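+One hypothetical way to state that middle ground: let $Q(v)$ map a venue
+$v$, whether a journal or a rated project, to a class weight, with an A+
+journal and an A+ project mapped to the same value. Credit for a piece of
+work $x$ submitted to venue $v$ would then be
+\[
+  \mathrm{credit}(x) = Q(v) \cdot N(x),
+\]
+with $N$ the novelty score sketched earlier, so that publishing an algorithm
+and contributing its full implementation and documentation to a project of
+the same class earn equal weight.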
+The only thing I worry about is the arguments against reviewing
+documentation (for instance, how does the implementation of something in one
+project ensure that it can be reimplemented in a different scenario if the
+documentation is not aimed at that?). The capability to reimplement and to
+produce results for a different set of users and a different set of purposes
+should also carry weight (that is, how much help does this implementation
+give towards doing that?). That usually doesn't fall under the aims of a
+project, and projects don't care, but those who are contributing and waiting
+for it to be counted towards their degree or salary should be aware of this
+and do it. Collaborative publishing can very well be used here, and the
+example of Wikipedia supports the claim.
+Acceptance by the user community is a validation of novelty. But how a
+detail or contribution is received may not always be a measure of novelty
+(some very novel contributions might not trigger much response, while some
+trivial ones trigger a huge response). So to evaluate novelty and original
+contribution, there should be a mechanism which, in turn, projects can use
+to count or evaluate their innovativeness or novelty factor.
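+For instance, carrying the earlier hypothetical score forward, a project's
+novelty rating over its set of accepted contributions $C$ could simply be
+the mean
+\[
+  P = \frac{1}{|C|} \sum_{c \in C} N(c),
+\]
+though the exact aggregation is, of course, one of the things a draft
+framework would have to settle.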
+Such a mechanism, along with mandatory documentation of the contribution in
+a collaborative, peer-reviewed, wiki-like system, should ensure the freedom
+of the knowledge generated in the process. It is a matter not just of
+getting FOSS accepted in mainstream academic research, but more or less of
+bringing the idea of freedom back to academia. We should prepare a draft
+framework (I don't have much idea how to prepare it), then try evaluating
+some recent contributions to some projects on the basis of this framework
+(we could use SILPA as one of the sample projects), and then present this to
+the world as a method of counting novelty in collaborative projects without
+relying on the usual publication statistics (all FOSS projects maintained by
+universities or research organizations cite their publications to show
+novelty).
+\newpage