Thursday, December 11, 2008

Powerful .NET Technologies

I have written this post to introduce several technologies available to .NET developers that I consider exciting to work with. I will delve into the details of each of them in future posts. I do recommend that non-.NET developers try to understand the rationale behind each technology and find their own development environment's equivalent, or better still pioneer their own projects to add these to their platform. In this post I will give an introduction to four technologies in .NET, namely Language Integrated Query (LINQ), Windows Communication Foundation (WCF), Windows Presentation Foundation (WPF) and Windows Workflow Foundation (WF). I will go right ahead and give you a feel of what each is all about.

LINQ

LINQ is Microsoft’s technology to provide language-level support for querying data of all types. These types include in-memory arrays and collections, databases, XML documents, and more. Virtually any data store would make a good candidate for supporting LINQ queries, including databases, Microsoft’s Active Directory, the registry, the file system, an Excel file, and so on. LINQ offers a compact, expressive, and intelligible syntax for manipulating data. The real value of LINQ comes from its ability to apply the same query to an SQL database, a DataSet, an array of objects in memory or an XML file. LINQ requires the presence of specific language extensions.

LINQ uses an SQL-like syntax to make query expressions well beyond the capabilities of embedded SQL as implemented in programming languages. That's because embedded SQL uses a simplified, streamlined syntax to add SQL statements to other programming languages, with no attempt to integrate such statements into the native syntax and typing mechanisms. Thus, you can't invoke native language constructs such as functions in embedded SQL statements, as you can using LINQ, because LINQ is implemented to use native syntax, structures, and typing mechanisms. Furthermore, LINQ may be used to access all kinds of data, whereas embedded SQL is limited to addressing only databases that can handle SQL queries. Here is an example of using LINQ to SQL.
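The sketch below assumes a NorthwindDataContext class and its Customers table as generated by the LINQ to SQL designer; those names are illustrative, not part of this post. The query is only translated to SQL and executed when the results are enumerated.

using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Hypothetical data context generated by the LINQ to SQL designer.
        var db = new NorthwindDataContext();

        // Composed here; translated to SQL and executed when enumerated below.
        var londonCustomers =
            from c in db.Customers
            where c.City == "London"
            orderby c.CompanyName
            select new { c.CompanyName, c.ContactName };

        foreach (var customer in londonCustomers)
            Console.WriteLine("{0} ({1})", customer.CompanyName, customer.ContactName);
    }
}

The same query syntax works unchanged over an in-memory collection or an XML document; only the provider behind it changes.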

WCF

Web services, which use standard protocols for application-to-application communication, have changed software development. The benefits of these changes should be reflected in the tools and technologies that developers use. Windows Communication Foundation is designed to offer a manageable approach to distributed computing, broad interoperability, and direct support for service orientation.

WCF simplifies development of connected applications through a new service-oriented programming model. WCF supports many styles of distributed application development by providing a layered architecture. At its base, the WCF channel architecture provides asynchronous, untyped message-passing primitives. Built on top of this base are protocol facilities for secure, reliable, transacted data exchange and a broad choice of transport and encoding options.

The typed programming model (called the service model) is designed to ease the development of distributed applications. The service model features a straightforward mapping of Web services concepts to those of the .NET Framework common language runtime (CLR), including a flexible and extensible mapping of messages to service implementations in languages such as Visual C# or Visual Basic. It includes serialization facilities that enable loose coupling and versioning.
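To make the service model concrete, here is a minimal sketch of a contract, an implementation and a self-hosted endpoint; the service name, address and choice of BasicHttpBinding are illustrative, and in a real application the endpoint would usually come from configuration.

using System;
using System.ServiceModel;

[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    string Greet(string name);
}

public class GreetingService : IGreetingService
{
    public string Greet(string name)
    {
        return string.Format("Hello, {0}", name);
    }
}

class HostProgram
{
    static void Main()
    {
        // Self-host the service over basic HTTP (SOAP 1.1, widely interoperable).
        using (var host = new ServiceHost(typeof(GreetingService),
            new Uri("http://localhost:8000/greeting")))
        {
            host.AddServiceEndpoint(typeof(IGreetingService), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service is running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}

Swapping BasicHttpBinding for another binding (say, a TCP binding with binary encoding) changes the transport and wire format without touching the contract or the implementation, which is the point of the layered architecture described above.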

With WCF, distributed applications are much easier to implement, for the following reasons:

  • Because WCF can communicate using Web services, interoperability with other platforms that also support SOAP, such as the leading J2EE-based application servers, is straightforward.

  • You can also configure and extend WCF to communicate with Web services using messages not based on SOAP, for example, simple XML formats like RSS. 
  • Performance is of paramount concern for most businesses. WCF was developed with the goal of being one of the fastest distributed application platforms Microsoft has produced.
  • To allow optimal performance when both parties in a communication are built on WCF, the wire encoding used in this case is an optimized binary version of an XML Information Set. Messages still conform to the data structure of a SOAP message, but their encoding uses a binary representation of that data structure rather than the standard angle-brackets-and-text format of the XML 1.0 text encoding. Using this option makes sense when, for example, a client application is also built on WCF and performance is an important concern.
  • Managing object lifetimes, defining distributed transactions, and other aspects of Enterprise Services are now provided by WCF. They are available to any WCF-based application, which means that your application can use them with any of the other applications it communicates with.
  • Because it supports a large set of the WS-* specifications, WCF helps provide reliability, security, and transactions when communicating with any platform that also supports these specifications.
  • The WCF option for queued messaging, built on Message Queuing, allows applications to use persistent queuing without using another set of application programming interfaces.

The result of this unification is greater functionality and significantly reduced complexity.

WF

Most businesses require processes to function properly. There are different types of processes: some are human-intensive, others machine-intensive, and others a combination of the two. Some examples of business processes are payroll, new product introductions, new employee hiring, and so on. In most cases, these business processes require intervention from multiple entities and are therefore normally long running. Workflow is one of the mechanisms used by businesses to express their business processes as a series of self-contained activities. Business Process Management (BPM) systems provide an environment for developers to create, execute, and manage workflows. These workflows are normally expressed using Finite State Machines (FSM), Unified Modeling Language (UML) activity diagrams, UML swim lanes, or flow charts. The WF technology complements the .NET Framework with a group of workflow-related components that give developers the ability to define, compile, instantiate, debug, and track workflows. It ships as part of the .NET Framework 3.0 (formerly code-named WinFX), together with Windows Presentation Foundation and Windows Communication Foundation.

WF workflows are composed of activities. Activities represent discrete pieces of functionality that are used to run specific business activities. There are two types of activities: composite and individual activities. Composite activities are used to express control statements (While, For, If-Then-Else, Case, etc.) and to group activities that share behavior (Sequences, Conditioned Activity Groups, etc.); they are also used to develop reusable sub-processes or sub-workflows. Individual activities, on the other hand, provide a mechanism for expressing single pieces of work that are executed as one step in the workflow.

The workflow runtime is responsible for taking workflow definitions and instantiating them. The life cycle of workflow instances is managed by the workflow runtime, which is responsible for creating, executing, threading, persisting, tracking, communicating execution events, and coordinating transactions. These functions are managed by the runtime via services. There is a set of default services that the runtime uses to manage all of its workflow instances: threading, transactions, tracking, state management, and so on. Application developers who integrate workflows into their existing applications can override these services; using this service model, developers can expose their existing hosting infrastructure to the workflow library. The framework provides a set of out-of-the-box services that allow developers to start using the environment quickly, without having to write complicated code.
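As a rough sketch of these ideas using the WF 3.x API (the workflow and activity names are illustrative, and the project needs references to System.Workflow.Runtime, System.Workflow.Activities and System.Workflow.ComponentModel):

using System;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// A workflow defined in code rather than XAML: a sequence containing one code activity.
public class HelloWorkflow : SequentialWorkflowActivity
{
    public HelloWorkflow()
    {
        // The activity tree may only be changed while CanModifyActivities is true.
        CanModifyActivities = true;
        var hello = new CodeActivity { Name = "helloActivity" };
        hello.ExecuteCode += (sender, e) => Console.WriteLine("Hello from the workflow runtime");
        Activities.Add(hello);
        CanModifyActivities = false;
    }
}

class Program
{
    static void Main()
    {
        using (var runtime = new WorkflowRuntime())
        {
            var finished = new AutoResetEvent(false);
            runtime.WorkflowCompleted += (sender, e) => finished.Set();
            runtime.WorkflowTerminated += (sender, e) => finished.Set();

            // The runtime creates and manages the instance; the host simply waits for it.
            WorkflowInstance instance = runtime.CreateWorkflow(typeof(HelloWorkflow));
            instance.Start();
            finished.WaitOne();
        }
    }
}

Persistence, tracking and the other runtime services mentioned above are registered with the same WorkflowRuntime object before it starts, so the host application decides which of them to use.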

WPF 

Windows Presentation Foundation is Microsoft's platform for building rich client applications and browser-hosted applications. With WPF, developers can use XAML, the Extensible Application Markup Language, to declaratively create user interfaces, custom controls, graphics, 3D content and animations that are not available in traditional HTML implementations.
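Interfaces are usually declared in XAML, but the same object model is available from code. Here is a minimal sketch of a WPF window built entirely in C# (the window and control here are illustrative; the project needs references to PresentationFramework, PresentationCore and WindowsBase):

using System;
using System.Windows;
using System.Windows.Controls;

class App
{
    [STAThread]
    static void Main()
    {
        var button = new Button { Content = "Click me", Margin = new Thickness(20) };
        button.Click += (sender, e) => MessageBox.Show("Hello from WPF");

        var window = new Window
        {
            Title = "WPF sketch",
            Width = 300,
            Height = 200,
            Content = button
        };

        // Application.Run starts the dispatcher loop and shows the window.
        new Application().Run(window);
    }
}

The equivalent XAML markup simply describes the same Window and Button objects declaratively, which is what makes designer tooling around WPF possible.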


Wednesday, December 10, 2008

Google Chrome 'coming out of beta'

Posted by Stephen Shankland

Google's Chrome Web browser is coming out of beta testing, according to a TechCrunch report Wednesday.

Marissa Mayer, Google's vice president of user experience, told TechCrunch's Mike Arrington as much in an interview at Le Web 08, according to the report. However, there was no word about when the move might take place.

One possibility would be to announce it Thursday at Add-on-Con, a conference about browser extensions at which Nick Baum, a product manager on Google Chrome, is scheduled to speak on a panel about the future of Web browsers. Also on the panel are Joshua Allen, senior technical evangelist for Microsoft's Internet Explorer, and Mike Shaver, vice president of engineering for Firefox builder Mozilla.

Taking the browser out of beta would doubtless fulfill Google's ambition to let business partners, such as computer makers, bundle Chrome on their systems. Google launched the first beta version in September.

However, Chrome is still rough around the edges for a version 1.0 product. New Chrome developer releases arrive frequently to stamp out bugs. Hotmail only works with Chrome if users launch it with a particular command-line option to fool Microsoft's e-mail site into thinking it's not using Chrome. And at least for me, even Google's own Zeitgeist 2008 Web site doesn't work properly in Chrome: the country-specific pop-ups are cut off at the bottom of the browser view. (The same pop-up issue arises in Internet Explorer and Safari, but not in Firefox.)

Also, although Chrome has been in development internally at Google for years, it's curious that the company would take Chrome out of beta when it's resisted the impulse to do the same with Gmail and several other high-profile projects.

Chrome works only on Windows for now, though Google is working on a Mac version and a Linux version.

Google didn't immediately respond to a request for comment.

Separately, Arrington reported that Mayer said Google plans to include an option in the first quarter of 2009 to turn off the new SearchWiki feature, which lets people customize their own search results.

Stephen Shankland covers Google, Yahoo, search, online advertising, portals, digital photography, and related subjects. He joined CNET News in 1998 and since then also has covered servers, supercomputing, open-source software, and science. E-mail Stephen.

Friday, December 5, 2008

Web Applications and Websites - a sneak peek into the future

For those of you who are into web development full time, here is a sneak peek into the future of websites. You all know how annoying it is to remember all those passwords you use to log in to websites like Gmail, Yahoo, Facebook, hi5, Tagged, MySpace, AOL and so on. Well, the solution is right here.


What is OpenID?

OpenID eliminates the need for multiple usernames across different websites, simplifying your online experience.

You get to choose the OpenID Provider that best meets your needs and most importantly that you trust. At the same time, your OpenID can stay with you, no matter which Provider you move to. And best of all, the OpenID technology is not proprietary and is completely free.

For businesses, this means a lower cost of password and account management, while drawing new web traffic. OpenID lowers user frustration by letting users have control of their login.

For geeks, OpenID is an open, decentralized, free framework for user-centric digital identity. OpenID takes advantage of already existing internet technology (URI, HTTP, SSL, Diffie-Hellman) and realizes that people are already creating identities for themselves whether it be at their blog, photostream, profile page, etc. With OpenID you can easily transform one of these existing URIs into an account which can be used at sites which support OpenID logins.

OpenID is still in the adoption phase and is becoming more and more popular, as large organizations like AOL, Microsoft, Sun, Novell, etc. begin to accept and provide OpenIDs. Today it is estimated that there are over 160 million OpenID-enabled URIs, with nearly ten thousand sites supporting OpenID logins.

Who Owns or Controls OpenID?

OpenID has arisen from the open source community to solve the problems that could not be easily solved by other existing technologies. OpenID is a lightweight method of identifying individuals that uses the same technology framework that is used to identify websites. As such, OpenID is not owned by anyone, nor should it be. Today, anyone can choose to be an OpenID user or an OpenID Provider for free without having to register or be approved by any organization.

The OpenID Foundation was formed to assist the open source model by providing a legal entity to be the steward for the community by providing needed infrastructure and generally helping to promote and support expanded adoption of OpenID.

As Brad Fitzpatrick (the father of OpenID) said, “Nobody should own this. Nobody’s planning on making any money from this. The goal is to release every part of this under the most liberal licenses possible, so there’s no money or licensing or registering required to play. It benefits the community as a whole if something like this exists, and we’re all a part of the community.”

More about this at http://openid.net/what/

Thursday, December 4, 2008

DevCore's Summer Of Code - Competition!!! Competition!! Competition!!


Anagrams are words that contain the same letters, not necessarily in the same order.

For example, "loop", "pool", and "polo" are all anagrams of each other, because each contains one "l", two "o"s, and one "p".

Any word is considered to be an anagram of itself.

The task at hand is to come up with code in a language of your own choice that tests whether two strings are anagrams of each other.

The rules of the game are as follows:

1) Your source code must consist of only a function definition (in the case of procedural languages), or a method definition (in the case of object-oriented languages), in a form that may be similar to the template given below.
2) The function or method must take only two parameters or arguments of data type string, or two string arrays, or pointers to string arrays.
3) It must return a boolean or equivalent data type representing a true or false state, indicating whether or not the two strings passed are anagrams of each other.
4) The body of the method can contain any number of lines of code, but the definition of the method must be broadly similar to the following:

public boolean areAnagramsOfEachOther(String string1,String string2){
//Method body

//
}

The GOAL of this competition is to come up with an ALGORITHM that stands out not only as the best, but also as the most OPTIMIZED code, qualified by such attributes as:
a) Minimal code redundancy and repetition
b) Fewer lines of code
c) Execution speed and usage of resources, i.e. the code that will essentially execute faster and use fewer resources for its execution.
d) Fewer memory leaks (if any) and no hidden bugs.

The WINNER gets the exclusive right to be PROMOTED to ADMINISTRATOR, with unrestricted permissions on our BLOG, DEVCORE.BLOGSPOT.COM.

C'mon guys, I know we all got pesky work to do, but this is a refreshing challenge.


Monday, December 1, 2008

SOA and Web Services

An SOA consists of a set of resources on a network that are made available as independent services, and that can be accessed without requiring any knowledge of how they are implemented. You can combine the services in an SOA to create an enterprise application. I will not consider the full theory of SOA, but the main benefits are that it enables you to create complex solutions that are independent of any specific platform and location. This means that you can quickly replace or upgrade a service or move a service to a different site (possibly running on faster hardware), and as long as the service exposes the same interfaces as before, you can continue to use it without needing to modify any code. However, SOA is not a magic wand that will instantly solve all of your distributed application architecture problems. To successfully design and implement an SOA, you should be aware of what has become known as the “Four Tenets of Service Orientation.” These are:

  1. Boundaries are explicit. Applications and services communicate by sending messages to each other. You should not make any assumptions about how a service processes a request or how a client application handles any response to a request. Following this principle can help to remove dependencies between services and client applications. Additionally, sending and receiving messages has an associated cost in terms of communications. You should design the operations that services implement with this in mind, and ensure that clients call services only when necessary.
  2. Services are autonomous. If you are building an application based on services, you might not have control over every service you are using, especially Web services hosted outside of your organization. The location of a Web service might change, or a service might be temporarily taken off-line for maintenance or other reasons. You should design your solutions to be loosely coupled, so that they can tolerate these changes and continue running even if one or more services are unavailable.
  3. Services share schemas and contracts, not classes or types. Services publish information about the operations that they implement and the structure of the data that they expect to send and receive. Clients use this information when communicating with the service. You should design contracts and schemas to define the interfaces that your services expose. This can reduce the dependencies that clients have on a particular version of your services. Services can change and evolve over time, and a new version of a service might appear superseding a previous version. If a service is updated, it should maintain compatibility with existing clients by continuing to implement existing contracts and send messages that conform to existing schemas.
  4. Compatibility is based on policy. The schemas and contracts exposed by a service define the “shape” of the service but not the nonfunctional requirements that a client attempting to access the service must fulfill. For example, a service might have security requirements that state that clients must connect to it in a particular manner and send and receive messages by encrypting data in a specific way. This is an example of policy. The policy requirements of a service cannot be specified by using contracts and should not require additional coding on the part of the client or the service–these requirements might change over time and so should be decoupled from the implementation of the service and clients. You should design services so that their policy requirements are independent of any implementation, and you should ensure that clients abide by any policies required by the service. Additionally, all services and client applications must agree on how to specify this policy information (typically by using some sort of configuration file). This is the purpose of the WS-Policy framework, published by the World Wide Web Consortium, and widely adopted by Web service developers.
If time allows, I might provide a walkthrough of implementing SOA using WCF. Essentially the ideas behind the implementation are the same; it's a simple conversion from C# to Java.
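In the meantime, here is a small C# illustration of tenet 3 - a service publishing a data contract and a service contract rather than sharing a .NET type. All of the names are illustrative.

using System;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderSummary
{
    [DataMember] public int OrderId { get; set; }
    [DataMember] public DateTime PlacedOn { get; set; }
    [DataMember] public decimal Total { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    // Clients couple to this contract and to the schema of OrderSummary,
    // not to the class that happens to implement the service.
    [OperationContract]
    OrderSummary GetOrder(int orderId);
}

Because the wire format is defined by the contract and schema, the implementation behind the endpoint can change or be versioned without breaking existing clients, which is exactly the point of tenets 2 and 3.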

 

Shoko is In....

Guys, I am finally part of the family. Thanks to Ritz.

Web services

Guys, I intend to do web services in Java. Basically I have an idea that it's about Service Oriented Architecture (SOA). I would welcome suggestions on SOA and/or anything about web services, be it in .NET or Java, but preferably Java.

thanx guys

Web apps vs Desktop Apps

I think figuring out which is better between a web app and a desktop app depends strongly on the deployment environment. I think in the end it is a balance between control on one end and standards on the other; I'll explain.

If you have a lot of control over your deployment environment (i.e. you know it's going to be 100% Windows Vista, or 100% Java 6, or whatever platform of your choice), you can choose to develop a solution as a desktop application.

If, however, you have little control over the deployment environment, web applications would be more relevant, as they cater to the least common denominator (valid HTML over HTTP at the least). Because of these standards, your application doesn't care (or shouldn't) whether the user is using IE7 on XP, Firefox on Mac, Opera Mini on a mobile phone or IE5 on Win98.

Some of the disadvantages of web-apps are not always present, for instance speed/availability when the server is running on a local LAN.

My favourite advantages of web applications:
1. Deploying a web application is trivial - on the client side it's as simple as dragging a shortcut to the desktop
2. It allows for more fluid deployment cycles - incremental development is effortless

I'm going to turn the security argument on its head: say you want to convert a WMV video clip to MPG format. Would you rather a) download a video_converter.exe executable by some guy in Russia, or b) upload it to a video conversion website and download the converted file? I know my example is a bit contrived, but I wanted to show that the advantages of web apps and desktop apps depend on the scenario.

There are certain things web apps can't do, obviously (like printing and accessing local files directly). But for me, if there is a problem that can be equally solved by a desktop app or a web app, I'll choose the web app every time.

Concerning web and desktop apps, I also find web services quite compelling. Talking from a .NET point of view, there is a technology called WCF (Windows Communication Foundation), which is replacing web services in .NET. WCF is an ideal way of developing distributed applications (web or desktop apps). You can develop a WCF service that you host and that is accessible to virtually all types of applications. In other words, you can develop a WCF service that you host on a web server, a custom host application or a Windows service, and access it from a browser application or a desktop app independent of the platform you are using (of course, platform independence depends on the specific endpoint bindings you use). You can change the implementation of your service and need not worry about the clients as long as your interfaces remain the same.
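As a rough sketch of the client side, reusing the illustrative IGreetingService contract and address from the WCF sketch in the Powerful .NET Technologies post above (both are assumptions, not a real service):

using System;
using System.ServiceModel;

// The same illustrative contract the service publishes; the client only needs this,
// a binding and an address - never the service's implementation class.
[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    string Greet(string name);
}

class GreetingClient
{
    static void Main()
    {
        var factory = new ChannelFactory<IGreetingService>(
            new BasicHttpBinding(),
            new EndpointAddress("http://localhost:8000/greeting"));

        IGreetingService proxy = factory.CreateChannel();
        Console.WriteLine(proxy.Greet("DevCore"));
        factory.Close();
    }
}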

Web apps vs Desktop apps

Well, my view is that we cannot accurately point to the better technology here. It all boils down to the type of application you are developing. Here are a number of points to take note of when deciding which type of application is best.

1. Speed - if you are developing applications that need high response speeds, like Need for Speed, then certainly the web solution is not the way to go (for now; we might have very fast connections in the future). For normal applications, though, the web solution is often the better option, for the same reasons mentioned by James. The issue of speed might not be that consequential in your application if you make use of AJAX (a topic I suggest we might need to talk about).

2. Security - with the Internet comes more risk, and how much you can tolerate depends on the security requirements of your application and on whether you are going to use the Internet for the web solution or an intranet, in which case the risk is lower.

3. Accessibility - if the users of the system are constantly connected, then the web is ideal; on the contrary, desktop applications are more suitable for offline working and are targeted at a specific group of users in an organization. Web applications deployed over the Internet are normally accessible to anyone with an Internet connection, hence the audience of the application plays a vital role in deciding your implementation.

4. Reliability - applications with critical safety and reliability implications are normally deployed as desktop apps, again due to the higher risk and unreliable connectivity of the Internet.

These are just viewpoints to take note of. In my own view, I see more general applications being deployed as web applications, while customized applications will be deployed as web-based intranet solutions. The high-speed and tightly business-coupled apps will remain the desktop's domain. I can safely say both solutions are here to stay for a while; it might be of great use to be comfortable with both. For .NET and Java folks all is well, because you practically use the same environment for both types of application.

 

Web vs Windows applications

Interesting, gentlemen......

The reason I asked that question is that here we have found a "solution" to the major drawback of Windows apps:
1) With .NET there is a way of deployment called the ClickOnce method... it's really cool because all you do is deploy your application on the server, and all client machines can access the exe on the server via a browser. On accessing it you then deploy/install it on your local machine. It's got various options like checking for updates before every run on your local machine; if another version has been installed on the server it asks (or forces) you to run the latest version. This removes the problem of doing deployments on every machine, everyone is guaranteed to be running the same version, etc., etc... I can go on and on.
Also, I think that as far as ease or speed of development and security go, it's more to do with the kind of application you are doing and what you are comfortable with, but mostly the design.

Let's talk, gentlemen

Sunday, November 30, 2008

Web vs Desktop applications continued

Well, this sounds very interesting; believe you me, guys, I'm learning from every post you send on this blog - keep up the spirit. I also believe web applications are a bit easier to implement in terms of development, considering that the likes of myself (you all know me) managed to develop a web application for my company that is currently in use and has been adopted by management as a good system. I am happy about that.

Points

1. Speed - there are actually two ports that you can use for deploying a web or enterprise application, namely HTTP port 8080, which is relatively slower and less secure compared to HTTPS port 8181. Specifically, with my GlassFish application server I don't need to manually set these ports during deployment; I just need to provide the https://marleyz:8181/project link to my users. I have tried this and there is a difference, and I also believe the difference is noticeable when pulling large amounts of data.

2. Ease of development - whoever is behind NetBeans, I salute you........... Servlets and JSPs are easy to develop in NetBeans, and that allows rapid application development. Here is a link just in case you are interested in trying it: http://www.netbeans.org/kb/trails/web.html

Hope my garbage won't bug the smart.

Web versus Desktop Applications

This is something I picked up from the web. But the truth is, the issue has been the subject of debate for a long time.

Pros and Cons of Desktop and Web Applications:

Easily Accessible
Web applications can be easily accessed from any computer or location that has Internet access. Travelers especially benefit from the accessibility. This often means that if a traveler has access to a computer, phone or handheld with Internet connectivity they can utilize the web application.


Low Maintenance & Forced Upgrades
Desktop applications need to be individually installed on each computer, while web applications require a single installation.
Many web applications are hosted by a third party, and the maintenance falls under the application host's responsibility. The ability to update and maintain web applications without distributing and installing software on potentially thousands of client computers is a key reason for the popularity of web-based applications. This can be a blessing and a curse, as users of web applications on hosted systems are at the mercy of the host: if an upgrade does not go well, or the individual user doesn't want or need the new features, the upgrade will still go forward.

Increased Security Risks
There are always risks involved when working online. Regardless of how secure a host might say a web application is, the fact of the matter is that the security risk of running an application over the Internet is more significant than when running an application on a standalone desktop computer. Some applications require more security than others: playing Sudoku in a web application would cause little concern, but dealing with sensitive corporate formulas or accounting details in a web environment might be deemed risky.


Cost
Over the life of the software, web applications are typically significantly more expensive. Desktop applications are purchased outright, and rarely is there a recurring fee for the software's use. Some desktop applications do have maintenance fees or fee-based upgrades associated with them, but rarely is there a subscription fee associated with the software's ongoing use.

Many corporate web applications use a different model: users are typically charged a monthly service fee to operate the software. These fees are considered "subscription fees". If you fail to renew your subscription, you may be unable to access the data stored in the web application.


Connectivity
Web applications rely on persistent and unmanaged connectivity. If you do not have an Internet connection, or if your host does not have Internet connectivity, you cannot access the information. Critical applications or time-sensitive businesses cannot risk denial-of-service attacks or power outages interrupting their operations and their access to sensitive data.


Slower
Web applications, which rely on the Internet to transfer data rather than a computer's local hard drive, may operate more slowly. The speed may also vary based on the number of users accessing the application.

Backups & Ownership
Regardless of the platform, companies need to be sure that their data is appropriately backed up. When using a web application that is hosted by a third party, companies should clearly determine who owns the data housed in the application, and be sure that privacy policies prevent that data from being used by the web host.


Ultimately, the accessibility of web-based applications makes them very desirable. Web applications have some fundamental limitations in their functionality and are better suited to specific tasks. Understanding the pros and cons of each model will help users determine whether a desktop application or a web application will better suit their needs.

I don't know fellas... What do you reckon is the better technology? Desktop or web?

Web Applications

Hi guys
This is my first time making a post on this blog because I kept failing to register for some reason.
Ritz, thanks for the initiative.
I know some of us are not involved in web applications, but I am sure we all have one or two things to contribute. I have taken note of Taps, Dirk and Ritz's comments on web applications....
Sorry to take you back, guys, but what do you think is/are the MAIN advantage(s) of web applications over the Windows ones.... (seriously, it's not trivial)

Hilarious and yet true

1) Project Manager is a person who thinks nine women can deliver a baby in One month.

2) Developer is a person who thinks it will take 18 months to deliver a Baby.

3) Onsite Coordinator is one who thinks a single woman can deliver nine babies in one month.

4) Client is the one who doesn't know why he wants a baby.

5) Marketing Manager is a person who thinks he can deliver a baby even if no man and woman are available.

6) Resource Optimization Team thinks they don't need a man or woman; they'll produce a child with zero resources.

7) Documentation Team thinks they don't care whether the child is delivered, they'll just document 9 months.

8) Quality Auditor is the person who is never happy with the PROCESS to Produce a baby.

9) Tester is a person who always tells his wife that this is not the Right baby

10) Team Lead is a person who actually knows how many men and women are required to deliver the baby, but will not tell anyone.

Thursday, November 27, 2008

Web Applications in my world "Client Side"

I thought it best to write two separate articles: the first (this one) focuses on the client side (browser), and the second on the server side (which for me would mean PHP).

In an ideal world, web developers would be distinct from web designers. We are far from ideal, and I normally find myself doing design. The tools in my arsenal include Dreamweaver, jQuery, Firefox with the Web Developer, Firebug and YSlow plugins, and Aptana Studio.

I normally start off with Dreamweaver for UI design in plain HTML, then switch to Aptana to incrementally fill in business logic (thankfully, PHP works well as a RAD language).

I occasionally have to handcraft JavaScript, which can be a pain to debug without the above-mentioned Firefox plugins (alert(), anybody?).

Ajax, as a technology, has its share of critics, but if you call yourself a web developer and haven't started using it, you need to get with the program. You can choose one of many libraries and stick with it; my weapon of choice is jQuery.

Ajax has ramifications for the user experience. I think Gmail has been used as an example too many times, so I'll pose a generic scenario: suppose you have a table/grid with 1 729 rows, and the user deletes a single row - wouldn't it be too drastic to reload the entire page just to remove a single record? An async call without a page reload would be more efficient, and it's easier than you think.

Incidentally, Microsoft will be shipping jQuery with Visual Studio, for those who develop in .NET.

Debugging an Ajax app is similar to debugging a multi-threaded program - it can be a pain, but with the right tool set it can be done, more so with proper planning and keeping things simple.

If there's one thing I hate on the client side, it is tweaking CSS for IE to make pages render correctly. I design with Firefox in mind. If the application is for an environment I have control over (such as a client LAN), I don't bother with IE support; instead, I install Firefox on every box.

Web applications in my world "LAMP stack"

In this, my second post, I'll be sharing my views on web applications under my environment - the LAMP stack (Linux, Apache, MySQL, PHP).

PHP has its humble roots going back to 1995, when it was invented as a 'Personal Home Page' parser - that moniker has since been dropped, and PHP is now officially a recursive acronym for 'PHP: Hypertext Preprocessor'. Originally it followed the CGI script paradigm, where each page request would launch a separate heavyweight process. Whilst this is still possible, nowadays it is more common for PHP to run inside the server as a plugin (an ISAPI plugin under IIS, mod_php under Apache). This means requests may be handled by lightweight threads, without the expensive startup stage for each launch.

Whilst PHP is a scripted language, byte-code can be generated using a vendor-specific engine, such as the Zend Optimizer.

PHP has similar concepts to other web application frameworks - application state is stored in either session or request 'objects' (I use the term loosely; they are technically arrays).

PHP is rather flexible; it does not impose a 'PHP way' of doing things. As a result, there are a number of frameworks based on PHP, some of which are rather specialised but still extensible (WordPress for blogs, Joomla and Xoops for content management).

There are other, more general-purpose frameworks that run on PHP, including some that support scaffolding (automatic code generation). For some reason, most of them (if not all) use the MVC (Model-View-Controller) design pattern.

As a scripted language, PHP has no special 'build' or 'deploy' stages. That means deployment might be as simple as unzipping files onto your production server - which is a terrible way of doing it, by the way.

I tend to use a Subversion repository to manage my project assets and change management. Once I'm happy with the testing (or, more often, the deadline has approached), I deploy from the repository to the production server. Apache's mod_rewrite and .htaccess files ensure only legitimate PHP files are served, and nothing else that could be a security risk.

mod_rewrite is quite useful for another reason - as its name suggests, it rewrites HTTP requests on the fly. The most common use is URL prettification: a URL like http://eg.com/user/jane/photo can be mapped to http://eg.com/main.php?mod=user&id=jane&section=photo.

PHP is not without weaknesses - at times CPU-intensive processes time out; the default setting in the php.ini file (max_execution_time) is 30 seconds, which is not adequate for the generation of complex reports. I use the 'Linux' part of the stack to bypass this: I schedule a script to be launched in its own shell using cron, then cache the result (e.g. a PDF report). A PHP script launched in this way can execute at its leisure, as it has no execution time restrictions.

That's about it for Web apps under LAMP.

Web Applications in my world "ASP.net"

ASP.NET applications

The difference between web applications and rich client applications makes a lot of sense when you consider the ASP.NET execution model. Unlike a rich client application, the user never runs an ASP.NET application - or any web application, for that matter - directly. Instead, a user launches a browser and requests a specific URL over HTTP. This request is received by a web server. The web server has no concept of separate applications; it simply passes the request to the ASP.NET process. However, the ASP.NET process carefully separates code execution into different application domains based on the virtual directory. Web pages that are hosted in the same virtual directory (or one of its subdirectories) execute in the same application domain.

The Application Domain

An application domain is a boundary enforced by the CLR (the .NET runtime engine) that ensures that one application can’t influence another. The following characteristics are a direct result of the application domain model:

1.  All the web pages in a single web application share the same in-memory resources, such as global application data, per-user session data, and cached data. 

 2.  All the web pages in a single web application share the same core configuration settings. However, you can customize some configuration settings in individual subdirectories of the same virtual directory. For example, you can set only one authentication mechanism for a web application, no matter how many subdirectories it has. However, you can set different authorization rules in each directory to fine-tune who is allowed to access different groups of pages.

 3.   All web applications raise global application events at various stages (when the application domain is first created, when it’s destroyed). You can attach event handlers that react to these global application events using code in the global.asax file in your application’s virtual directory, as sketched below.
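A sketch of what such handlers look like in a Global.asax code-behind class (the handler bodies here are just placeholders):

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Runs once, when the application domain is created on the first request.
    }

    protected void Session_Start(object sender, EventArgs e)
    {
        // Runs whenever a new user session begins.
    }

    protected void Application_End(object sender, EventArgs e)
    {
        // Runs when the application domain shuts down or is recycled.
    }
}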

 Essentially, the virtual directory is the basic grouping structure that defines an ASP.NET application.  ASP.NET applications can include files such as: 

1.  Web pages (.aspx files)

2. Web services (.asmx files)

3. Code-behind files: If you are using the code-behind model, you may have separate source code files.

4.  A configuration file (web.config): This file contains application-level settings that configure everything including security, debugging and state management.

5. global.asax: This file contains event handlers that react to global application events (such as when the application is first being started).

6. Other components: These are compiled assemblies that contain separate components  with useful functionality. Components allow you to separate business and data access logic and create custom controls.

7.  Other files and resources that ASP.NET web applications will use, including stylesheets, images, XML files

 Application Lifetime

ASP.NET uses a lazy initialization technique for creating application domains. This means that the application domain for a web application is created the first time a request is received for a page in that application. An application domain can shut down for a variety of reasons, including the web server itself shutting down. But, more commonly, applications restart themselves in new application domains in response to error conditions or configuration changes. This model is designed to keep an application healthy and to detect characteristics that could indicate a problem has developed or that performance of the application has degraded (such as a long queue of outstanding requests, a huge amount of memory in use, and so on). Depending on your machine.config settings, application domains may be recycled based on the length of time the application domain has been running, the number of queued requests, or the amount of memory used. ASP.NET also automatically recycles application domains when you change the application - for example, if you modify the web.config file, or if you replace an existing web-page file or DLL assembly file. In both of these cases, ASP.NET starts a new application domain to handle all future requests and keeps the existing application domain alive long enough to finish handling any outstanding requests (including queued requests).

 Application Updates

One of the most remarkable features of the ASP.NET execution model is that you can update your web application without needing to restart the web server and without worrying about affecting existing clients. This means you can add, replace, or delete files in the virtual directory at any time. Being able to update any part of an application at any time without interrupting existing requests is a powerful feature. However, it’s important to understand the architecture that makes it possible. There is no special feature of the CLR that allows ASP.NET to seamlessly transition to a new application domain; in reality, the CLR always locks assembly files when it executes them. To get around this limitation, ASP.NET doesn’t actually use the files in the virtual directory directly. Instead, it uses a technique called shadow copy: during the compilation process it creates copies of your files in a separate directory. The ASP.NET worker process loads the assemblies from this directory, which means the copies are locked rather than the originals. The second part of the story is ASP.NET’s ability to detect when you change the original files; it simply relies on the ability of the Windows operating system to track directories and files and send immediate change notifications. ASP.NET maintains an active list of all assemblies loaded within a particular application’s application domain and uses monitoring code to watch for changes and act accordingly.

 Application Directory Structure

Every web application should have a well-planned directory structure. Independently of the directory structure you design, ASP.NET defines a few directories with special meanings:

Bin

This directory contains all the precompiled .NET assemblies (DLLs) that the ASP.NET web application uses. These assemblies can include precompiled web-page classes, as well as other assemblies referenced by these classes.

App_Code

This directory contains source code files that are dynamically compiled for use in your application. These code files are usually separate components, such as a logging component or a data access library. 

App_GlobalResources

This directory stores global resources that are accessible to every page in the web application.

Web Applications in My World - "JSP/J2EE/SERVLETS"

[Diagram omitted: the overall architecture of a typical J2EE web application - browser and WAP clients, a web container hosting JSP pages, servlets and JavaBeans, an EJB enterprise tier, and back-end data stores.]
From the picture above you can see the overall architecture, or model, that most web applications done in J2EE follow.

As you can see, a normal web app done in J2EE consists of JSP web pages, servlets if necessary, and data stores. Ideally all of these should be present, but it always depends on the type and complexity of the web app I am doing. In other words, a web app could contain all of these or just some of them.

The anatomy of the web application
In web applications developed using JSP and J2EE technologies (please refer to the diagram), we see the following major highlights of the architecture of web applications.
1) Most users use the web browser as a client to the web-enabled application. This could be any browser - Mozilla Firefox, Internet Explorer, Safari, Opera, Netscape, or any other browser that can send and receive HTTP and HTTPS requests and responses. Basically this means that whether you are on Linux, Windows, Macintosh or Solaris, any browser can access the web server and get pages. In addition (which has been omitted from the diagram), we could also have WAP-enabled devices - those that can browse WAP applications based on WML, a markup language for WAP devices - to allow clients to browse websites from their mobile devices like phones, PDAs and so on.
2) The web server is a software program, normally referred to in Java terms as a web container, because it contains and can handle, manage, process and serve JSP pages and servlet classes.
3) The web server can also host JavaBeans, which are standard Java classes that can access databases and file systems, and can do all that any ordinary Java program can do, including networking via sockets and XML processing.
4) The enterprise tier consists of EJBs (Enterprise JavaBeans), which are specialized Java classes that are supposed to make connecting to databases as easy as drag and drop, without your needing to write any SQL queries or such things. For more info, read books on EJBs, because I can't cover them all here.

Now the most interesting part of web applications is the web server.
The browser sends an HTTP or HTTPS request to the web server from any location on the Internet. WAP devices like mobile phones can also send WAP requests to the web server.
Upon receipt of a request, the web server invokes the servlet class specified by the browser to handle the request. (Note that the request string contains information about which servlet to invoke.) The servlet class, which is just a Java class, is simply written by the programmer in plain Java code, and can do anything from accessing databases, instantiating other classes (built into the Java runtime, or custom-made by the programmer), and accessing files in the local filesystem, XML files and network devices via sockets, to redirecting the request to another servlet class or JSP page, or returning a response back to the browser.
The servlet class takes the request string from the browser or WAP device, and the programmer extracts the various request parameters such as form data, links, user agent, user information and so on. The programmer can then use this information to process data, make calculations, or perhaps even query a database from the servlet class.
When done, the servlet either redirects the request to another servlet, renders HTML directly to the browser, or invokes a JSP page, which consists of Java code mixed with HTML. Either way, a response is sent to the client (browser or WAP device).
Besides servicing HTTP and WAP requests, servlets can also serve other protocols of the TCP/IP protocol stack.

More in the next post........