Thursday, July 17, 2014


Syntax

In linguistics, syntax (from Ancient Greek σύνταξις "coordination" from σύν syn, "together," and τάξις táxis, "an ordering") is "the study of the principles and processes by which sentences are constructed in particular languages."

In addition to referring to the discipline, the term syntax is also used to refer directly to the rules and principles that govern the sentence structure of any individual language. Modern research in syntax attempts to describe languages in terms of such rules. Many professionals in this discipline attempt to find general rules that apply to all natural languages.

Early history

Works on grammar were written long before modern syntax came about; the Aṣṭādhyāyī of Pāṇini (c. 4th century BC) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory.[2] In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.

For centuries, work in syntax was dominated by a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld in a book of the same title. This system took as its basic premise the assumption that language is a direct reflection of thought processes and therefore there is a single, most natural way to express a thought.

However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.

The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale.) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "Subject – Copula – Predicate." Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).)

Character Set


A defined list of characters recognized by the computer hardware and software. Each character is represented by a number. The ASCII character set, for example, uses the numbers 0 through 127 to represent all English characters as well as special control characters. European ISO character sets are similar to ASCII, but they contain additional characters for European languages.


Designing Programming Languages for Reliability

This paper contains several comments and thoughts on designing programming languages so that programs tend to be more reliable. It is organized as a list of suggestions for anyone who is about to design a new language.

Introduction
The Ariane 5 was a space launch vehicle designed and manufactured in Europe. On its maiden flight, the vehicle was destroyed in the launch phase due to a software error: an unhandled floating-point overflow. The software was written in Ada, the language which, several years earlier, the U.S. Defense Department had decreed was to be used for all military systems.

Software errors have killed innocent people. In an early infamous example, a computer (the Therac-25) controlled the X-ray dosage in a medical machine used to treat cancer patients. A programming error caused the machine to give lethal doses during radiation treatment, instead of the much smaller doses actually prescribed. Unfortunately, the lethality of the doses was not immediately detected, and several patients were affected.
On a smaller scale, home computers are crashing all the time. Perhaps you too have been annoyed by an unexpected, unexplained failure of software that you felt ought to function correctly.

Software reliability is clearly important and much research has been done on how to increase reliability. Nevertheless, software quality remains lower than most people find acceptable. What is the problem?
First, we must realize that reliability is costly and not every program requires the same level of reliability. The problem isn't that we can't produce programs with the desired level of reliability but that we don't put an appropriate emphasis on it. This is fundamentally an economic issue, not a technical issue.
Second, most programmers enjoy programming and the goal of making programming more enjoyable is often directly opposed to the approaches for making programs more reliable.
Third, some of the theoretical work on program correctness is too abstract. It may be too difficult for many programmers to understand and too costly for organizations to use. It may be overkill. For many applications, we want a small, incremental increase in reliability for a small, incremental increase in programmer effort.

My feeling is that the programming language itself has a huge impact on reliability. This document contains a number of thoughts on how we can design languages to encourage correct programming.
I'll show several examples in C and C++. I don't mean to pick on C and C++ in particular. After all, it was clearly stated in the design of C and C++ that tradeoffs were always made in favor of efficiency, and that the languages were inherently dangerous and difficult, meant only for experienced programmers operating under "buyer beware." I'll use C and C++ because the problem of bugs is worse in those languages than in other, more modern languages, and because they are so widely known and used today.


Posted on Thursday, July 17, 2014 by Unknown

No comments


A compiler and an interpreter serve the same purpose: converting instructions in a high-level language (like C or Java) into the binary form understandable by computer hardware. Both are software used to execute high-level programs, and specific compilers and interpreters are designed for different high-level languages. Although they share the same objective, they differ in how they accomplish the task of converting a high-level language into machine language. This article describes the basic working of each and the basic difference between a compiler and an interpreter.

Compiler
A compiler is a program that translates a high-level language into machine language. When a user writes code in a high-level language such as Java and wants to execute it, a compiler designed for Java is used before execution. The compiler scans the entire program first and then translates it into machine code, which the computer's processor executes to perform the corresponding tasks.

 

Interpreter

Interpreters are not much different from compilers. They also convert high-level language code into machine-readable binary equivalents. Each time an interpreter receives high-level code to execute, it converts the code into an intermediate form before converting that into machine code. Each part of the code is interpreted and then executed separately, in sequence; if an error is found in one part of the code, interpretation stops there and the remaining code is not translated.



Programming language theory (PLT) is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It both depends on and affects mathematics, software engineering, and linguistics. It is a well-recognized branch of computer science and an active research area, with results published in numerous journals dedicated to PLT, as well as in general computer science and engineering publications.

History

In some ways, the history of programming language theory predates even the development of programming languages themselves. The lambda calculus, developed by Alonzo Church and Stephen Cole Kleene in the 1930s, is considered by some to be the world's first programming language, even though it was intended to model computation rather than being a means for programmers to describe algorithms to a computer system. Many modern functional programming languages have been described as providing a "thin veneer" over the lambda calculus,[1] and many are easily described in terms of it.

The first programming language to be proposed was Plankalkül, which was designed by Konrad Zuse in the 1940s, but not publicly known until 1972 (and not implemented until 1998). The first widely known and successful programming language was Fortran, developed from 1954 to 1957 by a team of IBM researchers led by John Backus. The success of FORTRAN led to the formation of a committee of scientists to develop a "universal" computer language; the result of their effort was ALGOL 58. Separately, John McCarthy of MIT developed the Lisp programming language (based on the lambda calculus), the first language with origins in academia to be successful. With the success of these initial efforts, programming languages became an active topic of research in the 1960s and beyond.


Early History

The first programming languages predate the modern computer.

During a nine-month period in 1842-1843, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea about Charles Babbage's newest proposed machine, the Analytical Engine. With the article she appended a set of notes which specified in complete detail a method for calculating Bernoulli numbers with the Analytical Engine, recognized by some historians as the world's first computer program.

Herman Hollerith realized that he could encode information on punch cards when he observed that train conductors encode the appearance of the ticket holders on the train tickets using the position of punched holes on the tickets. Hollerith then encoded the 1890 census data on punch cards.

The first computer codes were specialized for their applications. In the first decades of the 20th century, numerical calculations were based on decimal numbers. Eventually it was realized that logic could be represented with numbers, not only with words. For example, Alonzo Church was able to express the lambda calculus in a formulaic way. The Turing machine was an abstraction of the operation of a tape-marking machine, such as those in use at the telephone companies. Turing machines laid the basis for the storage of programs as data in the von Neumann architecture by showing that a machine itself can be represented by a finite number. However, unlike the lambda calculus, Turing's code does not serve well as a basis for higher-level languages; its principal use is in rigorous analyses of algorithmic complexity.

Like many "firsts" in history, the first modern programming language is hard to identify. From the start, the restrictions of the hardware defined the language. Punch cards allowed 80 columns, but some of the columns had to be used for a sorting number on each card. FORTRAN included some keywords which were the same as English words, such as "IF", "GOTO" (go to) and "CONTINUE". The use of a magnetic drum for memory meant that computer programs also had to be interleaved with the rotations of the drum. Thus the programs were more hardware-dependent.

Which language counts as the first modern programming language depends on how much power and human-readability one requires before granting the status of "programming language." Jacquard looms and Charles Babbage's Difference Engine both had simple, extremely limited languages for describing the actions these machines should perform. One can even regard the punch holes on a player piano scroll as a limited domain-specific language, albeit one not designed for human consumption.

Modern History

1990s: the Internet age

The rapid growth of the Internet in the mid-1990s was the next major historic event in programming languages. By opening up a radically new platform for computer systems, the Internet created an opportunity for new languages to be adopted. In particular, the JavaScript programming language rose to popularity because of its early integration with the Netscape Navigator web browser. Various other scripting languages, such as PHP, achieved widespread use in developing customized applications for web servers. The 1990s saw no fundamental novelty in imperative languages, but much recombination and maturation of old ideas. This era also began the spread of functional languages.

A big driving philosophy was programmer productivity. Many "rapid application development" (RAD) languages emerged, which usually came with an IDE, garbage collection, and were descendants of older languages. All such languages were object-oriented. These included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative than the RAD languages were the new scripting languages. These did not directly descend from other languages and featured new syntaxes and more liberal incorporation of features. Many consider these scripting languages to be more productive than even the RAD languages, but often because of choices that make small programs simpler but large programs more difficult to write and maintain.

Nevertheless, scripting languages came to be the most prominent ones used in connection with the Web.
Some important languages that were developed in this period include:



Overview

Different languages have different purposes, so it makes sense to talk about different kinds, or types, of languages. Some types are:
  • Machine languages — interpreted directly in hardware
  • Assembly languages — thin wrappers over a corresponding machine language
  • High-level languages — anything machine-independent
  • System languages — designed for writing low-level tasks, like memory and process management
  • Scripting languages — generally extremely high-level and powerful
  • Domain-specific languages — used in highly special-purpose areas only
  • Visual languages — non-text based
  • Esoteric languages — not really intended to be used 
These types are not mutually exclusive: Perl is both high-level and scripting; C is considered both high-level and system.
Other types people have identified: Toy, Educational, Very High-Level, Compiled, Interpreted, Free-Form, Curly Brace, Applicative, Von Neumann, Expression-Oriented, Persistent, Concurrent, Glue, Intermediate, Quantum, Hybrid.

Machine Code

Most computers work by executing stored programs in a fetch-execute cycle. Machine code generally features
  • Registers to store values and intermediate results
  • Very low-level machine instructions (add, sub, div, sqrt)
  • Labels and conditional jumps to express control flow
  • A lack of memory management support — programmers do that themselves 

Assembly Language

An assembly language is basically just a simplistic encoding of machine code into something more readable. It does add labeled storage locations and jump targets and subroutine starting addresses, but not much more.

High-Level Languages

A high-level language gets away from all the constraints of a particular machine. HLLs have features such as:
  • Names for almost everything: variables, types, subroutines, constants, modules
  • Complex expressions (e.g. 2 * (y^5) >= 88 && sqrt(4.8) / 2 % 3 == 9)
  • Control structures (conditionals, switches, loops)
  • Composite types (arrays, structs)
  • Type declarations
  • Type checking
  • Easy ways to manage global, local and heap storage
  • Subroutines with their own private scope
  • Abstract data types, modules, packages, classes
  • Exceptions 

System Languages

System programming languages differ from application programming languages in that they are more concerned with managing a computer system rather than solving general problems in health care, game playing, or finance. System languages deal with:
  • Memory management
  • Process management
  • Data transfer
  • Caches
  • Device drivers
  • Operating systems

Scripting Languages

Scripting languages are used for wiring together systems and applications at a very high level. They are almost always extremely expressive (they do a lot with very little code) and usually dynamic (the compiler does little, the run-time system does almost everything).

Esoteric Languages

An esoteric language is one not intended to be taken seriously. They can be jokes, near-minimalistic, or despotic (purposely obfuscated or non-deterministic).





A programming language is a formal constructed language designed to communicate instructions to a machine, particularly a computer. Programming languages can be used to create programs to control the behavior of a machine or to express algorithms.

The description of a programming language is usually split into the two components of syntax (form) and semantics (meaning). Some languages are defined by a specification document (for example, the C programming language is specified by an ISO Standard), while other languages (such as Perl) have a dominant implementation that is treated as a reference.
