Showing posts with label science. Show all posts

Friday, July 10, 2015

The Path to Professionalism

While chatting with one of my friends this morning, I made a casual remark: "Disillusionment is the first step towards professionalism." Hm. I feel like patting myself on the back for saying that. :)

We all have dreamt of attaining greatness in so many ways! How many of us really get anywhere close to becoming great? Let me talk about myself. At some point or the other in my life, I have wanted to do all of the following:

  • Become a great writer.
  • Become a great scientist.
  • Become a great singer.
  • Become a great artist.
  • Become a great teacher.
  • Become a great engineer/programmer.


Most people I know call me quite versatile, just because I have become all of the above. Just remove the 'great' part of it. In other words, I can't claim greatness in anything as such. But I have done most of the things on that list for a very long time, year after year. And I have continued to enjoy doing them. And for that, I claim some credit.

On the way to do anything non-trivial, disillusionment is the first and biggest milestone. Disillusionment comes in various forms. Let me try and list some of them:

  • It's difficult: physically, mentally and financially expensive.
  • It's 5% inspiration and 95% perspiration.
  • If I don't finish it, it slides back to zero in no time. There's no way to build on my earlier work. It requires one heroic effort.
  • I am not that good at it.
  • Nobody cares.
  • The people I used to admire have turned out to be fakes.
  • I have to work with scoundrels who have no work ethic.


Some Examples

Let me give a couple of examples from my experience. 

Being a Scientist

My childhood image of a scientist was that of a chemist mixing colourful fluids with each other and creating funny fumes and explosions. That image was kind of cool. That image didn't have the various other components of chemistry in it: the eternal pungency of a chemistry lab, the taste of chemicals, the ketone burns, the ever-imbalanced chemical equations, the low grades ... While I gradually came to terms with the idea that I didn't have much of a future in chemistry, I was fortunate to get exposed to other forms of science where my disillusionments weren't severe enough to kill my wish to continue with them.

I continued on the path of science, but came across bigger and bigger disillusionments. You have to get good grades every time. You have to publish and keep publishing. You have to be visible. You have to keep working hard continuously to prevent yourself from getting caught in the vortex of anonymity. Sometimes, just in order to avoid being completely ignored, you have to fake interest and knowledge in some subjects. In conferences, you have to flock around the big names of your field. You have to flash smiles, laugh at bad jokes ... sometimes, even try to look good. Anything but what you would associate with a scientist. You have to tell lies about your work to get your papers through in journals: fabricate data, force-fit cool-sounding themes, spew mathematical symbols where you could as well have done without them, cite papers of authors who you know are on the programme committee.

Of course, a counter-argument to the above is: 'Be so good that you needn't do any of the above.' I agree. And I know people who haven't gone down the above path and have still done well. And I know at least one individual who, having not done that well, has still managed to keep himself clean of blemishes -- myself. But that doesn't change the overall DNA of the system, which is ruthlessly competitive, and even corrupt in certain ways. And that makes it a day-to-day struggle to survive in the world of science. In short, being a scientist today (or maybe at any time in history) is far from the cool image projected to us of an absent-minded genius, free of worldly worries, discovering new scientific marvels every day. A great disillusionment!

Being a Writer

I have been writing for as long as I remember. The wish to write a book that would take the reader into a different world, would show them things they wouldn't otherwise be able to see, occurred quite early in life, as early as the first time any book did the same to me. However, writing has turned out to be a much harder thing to do than my childish fantasies allowed me to perceive. It's not merely about having a great idea to share, and some grasp over a language to express it with. It's a continuous balancing act between brevity, simplicity, relevance, glamour, cohesiveness, and a host of other concerns. Writing a page in a diary and writing a piece for someone else are two completely different things. The first draft is ready in probably the first 10% of the time you spend in writing. The remaining 90% goes in the incredibly tedious process of reading, re-reading, revising, re-arranging etc., an unending struggle to give it that ever-elusive ideal form.

If it's a book, before you even get to write it up, you have to involve yourself actively in networking and canvassing to grab the attention of a publisher. Then, you have to fight him to prevent him from modifying your ideas beyond recognition to make your work 'more acceptable'. Afterwards, you have to go around marketing the book to get it sold. I have personally come across people -- authors -- who employ fairly disgraceful methods of getting themselves called authors, getting invited as guest speakers, and getting on-air time in well-publicised events, just to improve the probability of copies of their books being picked up by prospective readers from the book-store racks.

Similarly, in this Internet age, most readers have an infinitesimal attention span. Clicking a link to a blog post (which is a reaction to a distraction in the midst of work or other competing distractions) is not at all equivalent to picking a book from a shelf (which is an act of volition in more than one way). So, the time available before the prospective reader gets bored, closes the browser tab and moves on to the next distraction is of the order of a few seconds. You may have to do something really drastic to turn the visitor to your blog into a reader: for example, finishing up in one screen-full, using graphics, bullet points, politically charged statements, even sleaze. An Internet reader has no patience or time for a slow build-up and elaborate analysis. He wants quick pills of instant gratification which a sedately paced work will never give him.

In short, the imaginary world in which you wish to be an author was one where a good idea just needed to be expressed in order for a reader to read it and appreciate it. In the real world, an author, just like a real-world scientist, is competing with many others for the attention of a reader who is already deluged with distractions. The competition for the reader's mental real-estate is so cut-throat amongst writers that you shouldn't be surprised to find resistance and rivalry instead of intellectual camaraderie between two writers. Another huge disillusionment!

The Reality

The reality is, I am still a scientist. I think about new stuff, I read, I create ... every day of my life, I go through the rigours of a scientist's life like all the other scientists of the world. The visions of greatness have faded before my eyes. One may think I mean to say that doing science is now nothing but drudgery for me, and that I do it because I am left with no other choice at this stage of my life.

The reality also is that I keep writing copious amounts. Blogs, notes, diary, letters. I have stopped likening myself to Tolstoy, Premchand and Tagore. I don't even clearly visualise my words in print form any more. Yet, I keep writing.

One may think all this quite grim. I don't. I happily feel that I have made it across the most difficult part. I have made it across the ocean of disillusionments. This journey has left me humbled, but also clear about why I do the things I do. I am clearer about what I am prepared to do and what not in order to make advances towards possible acceptance, if not greatness, in these worlds of activity: scientific or literary. I now have solid evidence that I don't do my stuff because of a fantasy of glamour and greatness. Nor merely as a means for subsistence. Rather, there's something right there in the act of doing which pulls me irresistibly to these acts, and to do them everyday, apparently thanklessly.

A professional is a person who has survived the tempests of disappointment, depression and despair, voyaging in the ocean of disillusionments. He is a person who is driven by the very love of the act he earns his bread from. In fact, I would go to the extent of asserting that unless you go through the process of seeing layers of glamour and hopes of greatness being peeled off your profession, you don't even get called a professional in it.

Thursday, August 06, 2009

Flexibility in Education -- A Thought for Future

As I see my child growing day by day, it's a vivid experience tracking his progress. There are things he is quick to learn -- babbling, meditating on something, relishing food ... There are things in which he seems to be falling behind when compared with his peers (other kids born within a span of a few weeks) -- grabbing things, turning on his tummy, sitting up etc. Comparisons are always being made. Wherever there seems to be a backlog, it tends to trigger negative thoughts. Elders and experienced people are quick to settle the matter by saying, "Hey! All children are different. He'll learn. Eventually. Let him go at his own pace."

For some months now, I have been turning the pages of books on probability models. Fortunately, my job affords me chances to go back to textbooks and learn things I never tried, or had tried but had given up or was made to give up. Probability used to occupy a part of our mathematics curriculum every year after 9th standard, for all the years that had a mathematics paper. I think I was fairly good at probability then, but wasn't exceptional. I learned it to the extent that was possible for me during those math courses. After those courses, I hardly ever got a chance to study probability until recently. There were other friends of mine who didn't show an initial aptitude for probability. They got a few years to demonstrate a growth in their aptitude. Most of them ended at the same level where they had started. They never studied probability after that. Unlike me, they will never study it again.

Perhaps, to a large extent, that's a thing to be thankful for. We ought to be spared having to struggle with the same set of subjects forever even though we show no inclination or necessity to learn them. There should definitely be a quick and efficient method in place to identify the natural gifts of a child, and propel him in that direction with all resources possible. But there is another side to it. We don't all develop our aptitude for certain subjects at the same time. Each child develops his own way of learning. Certain subjects which appeared like Greek in my adolescence now appear easier to grasp when I pick them up after a while. I can think of a few reasons upfront: one is experience, which prepares us to look at the subject from another angle. Another is the fact that, over the years, we develop techniques of thinking and reasoning. Arguments which might have appeared exotic to me when I was 18 now appear quite mundane.

Children pick up life-skills at various paces, in different orders. Most of them finally arrive. A child who learns to walk early is not more likely to become athletic than his peer who learns to walk later. An early talker isn't more likely to become an orator than one who learns to speak a little later. Nature doesn't put us in rigid curricula where we learn our subject along with our peers at a predesignated period of our life, and where we don't get another chance to learn once we miss the first few.

It would be good to learn our subjects in a slightly more flexible way too. If I don't understand probability in my 9th grade, can I take it in my 11th? If, during my 5th grade, I show a strong propensity towards literary skills, can my curriculum be enriched with language and literature subjects, scheduling my other subjects for later coverage? If before that I show a prodigious aptitude for literature, I may simply be spared my science courses and allowed to grow in my natural direction at a much accelerated pace. If I flunk math due to bad performance in some of the modules, can I be allowed to move on, keeping those modules for a repeat visit at a later point in my student-life?

In short, I am talking about flexibility. The tyranny of perpetual comparison with peers will be broken. The growth of the child will be guided by the direct intent of making him an employable citizen depending on his aptitude. The child will get an almost unlimited opportunity to learn subjects in a customised order. Moreover, it will be possible to maintain a much finer-grained profile of the student's strengths. For example, currently it's impossible to know whether a student of commerce had, at one point in time, shown exceptional calibre in problems of graph theory. Then, such profiling will be possible. Straitjackets of science, commerce and arts streams are outdated and rotten. This system will allow each student to design his own stream. Students will seek absolute excellence. Competence (being good at something) will not be confused with competitiveness (winning games and wars).

Looking closely, it's hardly a revolutionary idea. I see subdued forms of it in the current system of education, particularly in higher education. To implement this idea in all its glory, we need a much more developed way of assessing a child's progress. It may be expensive to implement as it will obviously call for more attention to be given to the individual growth of each student.

Maybe something of this sort will work out for a future society. Your inputs please! Particularly, as to how this proposed system could be broken.

Wednesday, April 15, 2009

Three Pillars of Enlightenment

We bought a farewell gift of three books -- one on popular science, one on Bhagwad-Gita, and one a humour novel by Wodehouse -- for one of our colleagues who spent his last day in our company today. The following message (edited) was written to bring them together:

Science: The disciplined approach of understanding everything through logical deduction, experiments and sensory observation.

Metaphysics: There lie truths beyond the boundaries of our 5 senses. Where science goes mute, mystical metaphysics becomes our guide.

Humour: Zen masters say that an acute sense of humour is the highest form of intellect; and a light-hearted laugh, the highest form of spiritual bliss.

Thursday, February 22, 2007

Arguing for Objective Definition

Abstract
This article presents my ideas about the relation between objectivity and science. After dwelling a bit upon how objectivity is the holy grail of scientific practice, we take the opposite stand, wherein viewing and interpretation are not matters to be ignored and hushed up, but subjects of fascinating study. We then discuss the question of what is the most significant characteristic of science. The purpose of this discussion is not specifically to answer the question, but to analyse the undercurrents of objectivity within it. Finally, we make some simplistic prescriptions as to how scientists could become better scientists, not by hushing up the inherent subjectivity of their trade, but by capitalising upon it.

1 Objectivity in Science
“Not influenced by personal feelings or opinions, considering only facts.” That is the definition of the term ‘objective’ in the Oxford dictionary. (Natural) science (hereon referred to as just science) has come up as the de facto instrument of the human race to create a body of knowledge which stands on such rigorous assessment that the possibility of its ever being contradicted can be reduced to practically nil. The proof of this rigour has two vehicles – formal theory and experiments. In fact, the emphasis on the non-contradictability of scientific knowledge is so great that certain questions – by no means less interesting and challenging than any other – have gradually been shunted out of the purview of scientific study, at least for the present. These questions do not lend themselves well to a rigorous scientific study – at least at the present time – due to the presence of one or more attributes which are difficult, if not impossible, to measure. Examples of such attributes are beauty, emotions, welfare etc. Such questions fall in the purview of the arts and social sciences. We will devote a later section to developing more elaborate arguments about what could be a precise definition of science. For the present, the point of the matter is the stress laid on the need for objectivity in science.

What is behind the obsessive pursuit of objectivity in science? The philosophical beliefs of positivism, realism, dualism and idealism interplay with each other here to point us to a possible answer. In essence, the pursuit of science seems to rest on the belief that there is one unified truth about everything. It may or may not be known, or even knowable, but its independence from the knower is complete. That truth is absolute. There is an implicit claim in the idea of science that science is the pursuit of the knowledge of that absolute truth.

2 Beyond Objectivity
We observe myriad things and events which are interesting and intriguing. However, their interestingness may often be associated with something beyond our power of measurement. Science refrains, in general, from claiming a central role in understanding such subjects. However, this hasn't stopped any interested soul from pursuing them. Let us use this section in identifying a set of such subjects. One important inference that can be drawn from this part of the discussion is that objectivity does not always count as the most important aspect of a scholarly pursuit.

2.1 Aesthetics and Art
Beauty, for instance, is intuitively a universally sensed attribute of various things. However, its actual immeasurability depends not only on its subjectivity, but also on the intractability of its physical origins. For example, it may be argued that a painting is beautiful owing to the faithfulness of its details to how people see the world. This isn't true of paintings which intentionally use effects that don't add to their realism, and yet add beauty to them. The absence of this realism is even sharper in a beautiful cartoon, where the artist's achievement lies in distorting the real world to the utmost to evoke humour. Perhaps the beauty of a cartoon lies in the cartoonist's ability to capture the essential features of an object, and to distort it to a ridiculous degree while still preserving the essential characteristic of the subject. Beauty might be tentatively identified as anything that evokes so-called positive sensations in the onlooker, e.g. humour, surprise or admiration. However, that may not be true if we consider that works of art – drawings, movies, music etc. – depicting morbid things like death and suffering earn tremendous accolades as excellent pieces of art. To say that they lack beauty would not be acceptable at all. Then perhaps one would say that beauty resides wherever creativity and sensitivity are involved in an act. Quite surprisingly, the beauty of nature is the most indisputably beautiful thing. It does not require a believer to appreciate this beauty (though this statement could be disputed). Therefore, beauty is not necessarily associated only with a creation.

We are far from equipped to state with any confidence what causes the presence of beauty in something, let alone to measure it with any degree of precision. Yet this question has intrigued people for centuries. And the urge to create beautiful things is one of the primal urges present in varying quantities in all human beings. Aesthetics is the branch of philosophy which tries to develop an understanding of the nature of beauty, particularly in works of art. The practice of creating beauty is art.

2.2 History
Among all the things man wants to know, his past is one. Man's wish to know everything about the events that led to him might well have been the dominant reason for him to start studying history. Yet history has evolved far beyond being merely a subject which tries to reconstruct the past. The concept of historical thinking or historical reasoning points to a skill of reasoning in historians quite different in nature from scientific thinking. It is an interesting matter to ask why historical thinking must be sanctioned a different form. In my view, the answer partly lies in the sparseness of the concrete evidence using which history is created. An account of an event that happened far removed in time and space from here and now can be expected to have lost many of the threads that led it to us, and could now lead us back to it. However, the onus on the historian is an ambitious one – to construct a story out of that evidence. This story involves not just plain physical elements but human factors, which are far more complex than anything else. Even in this information age, discovering the exact links of causality between interesting events is difficult. If we temporarily entertain the (dubious) assumption that societies of the past were simpler than the present, their complexity was still demonstrably daunting. Is it, therefore, hopeless to look backward as history does? Critics of history have strong arguments to offer. As per them, history neither qualifies as an art, nor as a science. Its interpretations are subjective and value-laden. Historians have been charged with trivialising the individual by placing disproportionate importance on the evolution of human society. The worst aspect, which is almost conventional wisdom, is that the chronicles created at every point, which later act as the principal tools of a historian, are at the mercy of the ruling powers of the age and country. Such documents are liberally used by many to serve their ulterior motives. How does one make use of such adulterated documentation to create an accurate account of the past?

Despite all these negative factors, the study of history remains an interesting thing to do. The subjectivity – to some extent, even the sensationalisation – of the accounts, instead of being detrimental to the cause of the study of the past, has turned into an effective method to create interesting narratives which can then be discussed in a much bigger council of historians. Finally, this should create consensus about the past – a simplistic, but necessary, assumption.

2.3 Literary Interpretation
Literary criticism is the application of literary theory – a subject which deals in developing an understanding of literary work. Its central questions revolve around: What is literature? What distinguishes good literature from bad? What are the various aspects of a literary work? What is the role of a critic?

Literary criticism as a practice is as old as literature itself. Its importance cannot be overstated, going by the fact that there are now authorities on whom a general consumer of literary work can rely for an informed and balanced review of literature. Like any study that tries to theorise an art, literary theory can be criticised for trying to place works of literature into pigeon-holes. However, like any other field of scholarly study, literary theory has also developed tremendously, and is an approximation of the ideal of a complete and correct theoretical infrastructure for the evaluation of literary quality.

From the point of view of our current discussion, the history of literary theory discloses an all-pervading element of subjectivity in the process of criticism. The question of the role of the critic in fulfilling the meaning of a literary work is actively studied. In a sense, a peculiar aspect of literature comes out: a literary work gets its many meanings when it goes into the hands of its many readers. The subjective aspect of literature is an inherent and essential one. That's a point rigorously accentuated by the subject of literary criticism.

3 How Objective is Science?
In this section, I summarise the ideas that raise an important question about science: how objective is science, after all? These ideas have often come from authoritative figures in the world of science. This demonstrates that science, in its highest form, perhaps never claims a total lack of subjectivity.

A good scientist doesn’t just accept the presence of subjectivity, but often seems to accept that its presence is an essential ingredient of what could be called science. Of the innumerable pieces of evidence of this widely recognised fact, I, rather expectedly, have had access to some which are close to my area of interest, Computer Science.

3.1 Knuth – The Art of Computer Programming
Donald Knuth, in his Turing Award lecture [1], spent a considerable part of it emphasising how computer programming is as much an art as it is a science. The central argument of the lecture is that even though computer programming is now largely a science, it is an artistic activity at its core. Even as our understanding of what it means for a program to be correct becomes more elaborate, programming itself is not on its way to dissociating itself from good taste, beauty and divergent thinking.

3.2 Penrose – The Emperor’s New Mind
The Emperor’s New Mind [3] is a monumental popular science book by the famous Oxford scientist Roger Penrose. His central aim in this book is to convince the reader that, given the state of the art of our knowledge of physics, it is impossible to build computers that could possibly think. The many arguments in this book are borrowed from various fields – the theory of computation, quantum physics etc. – all showing that it is fallacious to say that computers of the current day do anything like thinking in the proper sense of the word.
A dominant theme of the book is that critical aspects of human thinking are not algorithmic in nature. The book can't be quoted as a book against objectivity as such, but it does disavow the notion that the mechanisms of human thinking, as we know them, are understood well enough to be effectively mimicked by any machine in the near future. The roots of the most critical features of human thinking – creativity, imaginativeness and visualisation – lie beyond what has been discovered about the physical world.
I dare add that, when those breakthrough understandings about the physical world do unravel themselves to us, our notion of objectivity will not remain untouched by them.

3.3 Proof As a Social Process
A very active area of research in computer science is verification, wherein people create tools and methods to formally prove that a computer program meets its requirements. The motivation comes from the need for significantly enhanced confidence in a program's correctness and quality. The central claim of the proponents of verification is that if programs can be automatically and indisputably proved correct, they will be of a higher quality, and the process of software development will be much faster and more enjoyable.

Alan Perlis, the first ACM Turing Award winner, and his co-authors argue in their 1979 paper Social Processes and Proofs of Theorems and Programs [2] how erroneous the basic assumptions of the verification proponents are. Perlis et al. argue that there is no interesting mathematical proof which is accepted and used without having gone through an elaborate social process. They claim that proving is an essentially human activity, involving immense stress, strong emotions, publicising, reviews and so on. Opposed to this, a verification proof would be an unreadable mass of automatically generated symbols, which nobody would be able to read in the first place. Consequently, there is no hope of an automatically generated program verification proof ever taking the place of a mathematical proof. The field of verification has made significant strides since that paper, though the progress has not seriously disputed the above claim. The interesting point, from the perspective of this article, is the stress that the authors put on the social nature of mathematical proofs of theorems, and hence of programs. There is a strong indication that the authors believe that the activity of proving things is not algorithmic or mechanical; it is inherently human.

3.4 The Lost Knowledge Argument
The superior credibility that science has earned for itself is perhaps not to be disputed. What can be questioned is the role of objectivity in it. The most respected findings of science have been simple, often intuitive. Some have also been proved wrong after surviving for several centuries. The most rigorous theorems in science are at best cursorily understood by the wider scientific community. Their details are often way beyond the comprehension of all but a handful working in that area. This puts a serious question mark on the universal claim of objectivity that is made in science. The barrier of tremendous complexity protects many scientific claims both from being understood and from being seriously questioned by a large number of people. Such a claim becomes a proprietary property of sorts of its owners. How does such a piece of knowledge claim to be objective?
One argument in favour of objectivity would be: 'If you try hard enough...' Let us examine this argument a bit closely. The statement is inherently value-laden. Not only does it rest on an unmeasured quantity of 'hard enough', there is a subtle menace communicated, which seems to glare negatively both at not trying hard enough (insincerity) and at not being able to try hard enough (lack of intelligence). The fact that mathematical rigour is inaccessible to a large portion of the population should not be waved away as a mere state of affairs, but should be given due importance in the definition of objectivity.
I am repeatedly tempted to liken scientists to the mystic gurus of the orient who professed to have seen the truth. If you said to them, 'Hey! But I can't see it!', they too would perhaps say, 'If you try hard enough...' It is an important question to ask what really differentiates the mystic's claim from the mathematician's. While the mystic claims to have developed a perception for things beyond the limits of our usual senses, the mathematician seems to have sharpened his skill of looking through mathematical symbols to see what they mean. Unless the difference between a mystic's and a mathematician's capabilities is well understood, the debate on the objectivity of science is far from over.
Another strong argument in favour of objectivity is: in the case of science, there are at least some experts who can work the proofs out right from start to end. This is not true of mystical subjects, or of subjects like astrology.
Try looking at the argument from the other side. What happens if, of our five senses (sight, smell, hearing, touch and taste), one of the dominant ones (say, sight) disappears? What would happen to the large body of scientific knowledge that requires the existence of this faculty? Were this faculty to disappear from the face of the Earth, there would be no way by which this category of knowledge (in which, I am sure, a good portion of scientific knowledge can be placed, directly or indirectly) could be proved or disproved. There would remain no expert on such vast volumes of knowledge. Would such knowledge continue to remain scientific? The situation appears contrived, but it raises a fascinating question: is a piece of knowledge objective owing to some inherent attribute of it, or due also to certain circumstantial parameters? It appears to me that the answer is the latter. The scientific knowledge of today could lose its status as scientific knowledge if certain circumstances were to arise. Or else, scientific knowledge would have to forsake its claim that at least some experts exist who can prove its truth. The existence of a sufficient number of such experts is not an inherent property of the knowledge; it is partly circumstantial.

4 Conclusion – Science and Subjectivity
All of the above briefly brings out the idea that the practice of science has existed and excelled on many more factors than objectivity alone. The significant roles of creativity and taste categorically establish the artistic side of science. It would therefore be right to say that, instead of sweeping its subjective aspects under the rug, science should learn to accept them openly. I would even suggest that some degree of subjectivity would play an important role in making science more interesting to do, and more widely accessible. What follows are a couple of prescriptions as to where such subjective elements would work in the service of science. They may be simplistic and unpolished, but they are surely not grossly in the wrong direction.

4.1 Dialogue as a Communication and Pedagogical Tool
The ostensibly subjective nature of the subjects of interest in the social sciences makes communicating an exact thought a particularly challenging task. The language of mathematics, which proves a fairly effective medium of communication in many scientific subjects, falls short in its expressivity in matters of social science. The activity of knowledge communication therefore takes a very interesting form here: that of dialogue. There are long discussions on a matter, primarily not to work out the details of an artifact, but to refine the understanding of a concept and to build consensus by establishing a common language (vocabulary and grammar). Such an exchange is predicated on looking at an apparently subjective issue from its various angles, weighing opinions, and thereby refining the language of discourse. Studying this method of communication would be of significant value to teachers of science. Scientific pedagogues struggle to make their otherwise dry mathematical subjects interesting; the methods of exchange used routinely in the social sciences, and developed there to a very advanced level, would be of pragmatic value to them. These methods involve a narrative style, often studded with expertly moderated discussions between the instructor and the audience, among other things. Most scientists are woefully ignorant of such effective means of evoking interest in their audience. Somewhere deep down, all these techniques are rooted in a frank appreciation of the inherent subjectivity of the subject, and of the fact that the participants in a scholarly interchange are human beings, whose interpretation is by nature subjective.
The effective methods of catalysing such interchanges are not just ever more excruciating detail, but also those narrative styles which scientists often dismiss as merely decorative.

4.2 Widened View
Science is done well when it involves focus, but it is done better when there is also some widening of view. Science, despite its ivory-tower self-image, has its takers, like anything else. It makes sense to situate oneself in a wider social context so that one's efforts are more goal-driven. This setting of context can happen in various ways: what is known as market research at the business level can also be done at the scholarly level, by viewing one's work as part of a wider project. It would be too bold to claim that involving subjectivity in our world-view would bring a revolution in scientific results; but surely, it could revolutionise the way we do science.
