
You can replace Python with Ruby or Lua for the most part, and also Android with iOS. Dynamic languages aren't a big deal on mobile yet, and I think it's because they don't bind well or give you a significantly better developer experience than Objective-C or Java do on these platforms.

RubyMotion is probably the closest I've seen to making a language like Ruby or Python a very compelling mobile developer experience while maintaining the end user experience.

That basically means you need to write a Python compiler for iOS or Android to make it compelling, which means adding type checking and at some point developers start asking why they aren't just using Java in the first place...




I'd like to add a bit more.

You're correct that the type system is a crucial attribute of a platform language, but that's not the only reason. It also matters for correctness, safety, productivity and manageability.

That's because a platform is a huge and complex monster, so those properties are achievable only with automated tools, and those tools need rich metadata for every piece of code. Type information is crucial metadata, and it's mostly impossible to build high-quality tools without it. That's why all the deliberately designed modern platform (= system) languages are mostly typed: from C/C++/Objective-C to Java, C#, Go, Rust, Dart, TypeScript…

In fact, it doesn't matter whether the language is actually statically typed, dynamically typed, duck-typed, or completely untyped. The point is the ability to offer accurate metadata to automated tools, and a type system is the best mechanism ever invented for that. So languages lacking that ability cannot be a platform's primary language.


Exactly. I've never understood why someone would prefer an untyped language. It's just a bad development experience: code needs more debugging, it is harder to maintain, and overall productivity is lower. Untyped languages are fine only for small scripts.


There's a difference between typed and statically typed. Python is a strongly typed language, but it is dynamic.

> bad developing experience, code needs more debugging, it is harder to maintain and overall productivity is lower

All of these observations are highly subjective.

> Developing experience:

I much prefer developing in Python to Java. If IDEs factor in, there are a number available for Python, none of which I use, because I find a simple editor is usually enough.

> code needs more debugging

That's a function of the problem and the developer. The edit, run, check cycle in Python is a lot quicker than the edit, compile, run, check cycle in an IDE. There are debuggers available for nearly every language that allow you to step and inspect.

> harder to maintain

Disagree. When your codebase is 1/5th the size (number pulled from my ass), maintainability can be much better. Unit testing helps, regardless of the language.

> productivity

A developer proficient in language [X] should be just as productive as a different developer proficient in language [Y]. Creating a massive type hierarchy of classes and interfaces is a tonne of overhead when you are trying to express a simple idea. In a language like python, you might write a simple class (or two), and use dynamic typing appropriately. A java developer may use code generation and IDE shortcuts to lessen the amount of total code they have to write though.
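As an illustrative (made-up) sketch of the "simple class" route in Python: one small class, no interface, no getters/setters, no generated boilerplate.

```python
# A plain "record" class in Python -- no type hierarchy needed
# to express a simple idea.
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __repr__(self):
        return "Point(%r, %r)" % (self.x, self.y)

p = Point(1, 2)
print(p)  # Point(1, 2)
```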

The advantages and disadvantages of dynamic and statically typed languages are fairly well known. Neither is perfect all the time and for each person. Just because you don't like dynamic languages, it does not mean they don't have their virtues.


> I much prefer developing in python than java. If IDEs factor in, there are a number available for python, none of which I use, because I find a simple editor is usually enough.

Protip: Try pudb. It will blow your mind.


Working in C# right now and agonizing over the class hierarchy...


What does it even mean? You can dump everything into one class hacky Python way if you want. You can use dynamic keyword if you don't care about type safety.


We're not talking about monolithic, do-everything, hacky classes. You could do the same thing in C#.

Static languages (usually) force you to use interfaces and sub-typing just so common code can be reused. Duck typing is a much nicer way of working - without having to jump through seemingly unnecessary hoops. The situation is even worse when you code for testability. Things that really shouldn't have an interface now require one, so you can mock out the object appropriately. This is all avoided in a language with duck typing.
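A minimal Python sketch of that testability point (the `Notifier` classes here are hypothetical): the code under test only needs *some* object with a `send()` method, so the fake needs no interface or shared base class.

```python
class SmtpNotifier:
    """Production class (stubbed here; a real one would talk to a server)."""
    def send(self, msg):
        print("sending:", msg)

class FakeNotifier:
    """Test double: same shape as SmtpNotifier, no shared interface."""
    def __init__(self):
        self.sent = []
    def send(self, msg):
        self.sent.append(msg)

def alert(notifier, msg):
    # Duck typing: anything with a send() method is acceptable here.
    notifier.send("ALERT: " + msg)

fake = FakeNotifier()
alert(fake, "disk full")
assert fake.sent == ["ALERT: disk full"]
```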

This is actually where Go has a great impact. You get your static type checking without being forced to use explicit interfaces. It essentially uses duck typing for interface implementation.

You're free to enjoy static typing over dynamic typing all you like - but you shouldn't make the mistake of thinking dynamic typing is inferior in any way. It is different - just like imperative vs functional is different. You make a series of tradeoffs, that is all.



Yes, but it's not idiomatic C#. The point I've been trying to make is that both kinds of languages (dynamic and static) have their different virtues. You can drop down to dynamic in C#, but unless you have a very good reason, your co-workers will lynch you.


Makes sense.


I love C#, but when moving between it and python, I find I focus too much on the types and hierarchies.


I used to be like this before I realized I used class inheritance way too much and shifted to using composition in the majority of cases.


And the .NET framework tends to overuse the subclassing/overriding pattern where delegation is more appropriate. That frequently leads users to overuse subclassing/overriding as well.


++ this. I moved from C# to TypeScript and see that pattern change as the biggest difference in the code style I write.

I started TypeScript by subclassing, but as I work with it more and more, I subclass less and less and delegate more and more.


Where's the best place to brush up on composition vs inheritance?


If you're familiar with C#, Real World Functional Programming: With Examples in F# and C# [1] is an excellent resource for learning how and when to use composition over inheritance.

[1]: http://www.amazon.com/Real-World-Functional-Programming-With...


I thought composition vs inheritance was strictly an OO, not a functional thing?


Functional programming usually leans towards composition, and OO programming towards inheritance. "Hybrid" functional languages, like F# and Scala, allow you to use both styles, mixing them in whatever way is most useful for the particular problem you're solving; "pure" functional languages, like Haskell, don't offer OO-style inheritance, since composition is a better fit for combining side-effect-free ("pure") functions.

tl;dr -- Composition is to functional programming as inheritance is to OO programming.
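A tiny sketch of that contrast in Python: instead of subclassing to add behavior, you can compose small functions (the `normalize` pipeline below is a made-up example).

```python
# Composition: build new behavior by combining small functions,
# rather than overriding methods in a subclass.
def compose(f, g):
    return lambda x: f(g(x))

strip = str.strip
lower = str.lower

# Apply strip first, then lower -- no class hierarchy involved.
normalize = compose(lower, strip)

print(normalize("  Hello  "))  # hello
```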


Yeesh, $40 for the ePub directly from Manning, or $33 from Amazon for the print edition with a free ePub.


There are a few chapters of the book online for free at MSDN, and if you're going to buy the book, they also have a coupon code: http://msdn.microsoft.com/en-us/library/vstudio/hh314518%28v...


I agree, any tool must be used correctly.


I doubt you really mean untyped languages. Untyped languages include many assembly languages, BCPL and some Forths. They do not include typical scripting languages like Perl, Ruby or Python - all of which are strongly typed.


I don't know about Ruby but Perl certainly isn't strongly typed. You can run 'print "5.0" + 6' and get 11 as the answer. That's weak typing and types are implicitly converted to whatever.

Python is strongly typed only for the basic scalar types. With objects and classes there are just objects that may or may not have certain bound functions and attributes. Duck typing is mostly perfectly sufficient, since any errors do show up in practice, and there's no need for interfaces or classes as unique types. What would be really helpful would be Clojure-like multimethods, where dispatching of a function is itself a function of the arguments given. That would be what would most alleviate the problems that arise from everything being an object() in Python.
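Both halves of that claim can be seen in a few lines of Python (the `Duck` class is a made-up example):

```python
# Python scalars are strongly typed: no implicit string-to-number coercion.
try:
    result = "5.0" + 6
except TypeError:
    result = "TypeError"     # Perl would happily print 11 here

print(result)                # TypeError

# Objects, though, are duck-typed: any object with the right method works.
class Duck:
    def quack(self):
        return "quack"

def make_noise(animal):
    return animal.quack()    # no interface or base class required

print(make_noise(Duck()))    # quack
```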


Ruby is strongly typed. I'm not sure whether that's static or dynamic, but regardless of how it's implemented, Ruby lacks the ability to offer type information to code-writing-level toolsets, because it doesn't force type information to be retained on fields and function parameters.

So regardless of what actually happens at runtime, to the tools each Ruby function is just dealing with parameters of unknown type.

In conclusion, Ruby has types but no way to utilize them. I think every other popular scripting language - Python, JS, Lua… - is in the same situation. V8 does speculative dynamic typing, but the information it generates is completely useless to code-writing-level tools.


And users of dynamically typed languages will sometimes argue that the need for code generation tools is less necessary. You lose the ability to have tools do a lot of the work, but you also lose the need to have tools do a lot of the work. It's a trade off.

Type [an]notations are also useful for compilers when generating performant code. But projects like PyPy and V8 (JavaScript) show that a well-written interpreter can do run-time analysis and generate performant code, just as static analysis can.


Sorry, but what are the problems with everything being an object? That's a feature. Python 3.4 now has @singledispatch, though I don't know where I'd use that yet.
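For what it's worth, `@singledispatch` (added in Python 3.4's `functools`) gives a limited taste of the multimethod idea mentioned above - it dispatches on the type of the first argument only, not on an arbitrary function of all arguments. A small sketch with a made-up `describe` function:

```python
from functools import singledispatch

@singledispatch
def describe(obj):
    # Fallback for any type without a registered implementation.
    return "something"

@describe.register(int)
def _(obj):
    return "an int: %d" % obj

@describe.register(list)
def _(obj):
    return "a list of %d items" % len(obj)

print(describe(3))        # an int: 3
print(describe([1, 2]))   # a list of 2 items
print(describe("hi"))     # something
```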


To be pedantic, assembly languages do generally have multiple types, if by 'type' we mean 'a set of values disjoint from other sets of values'. For example, x86 has the types integer, floating point, MMX, SSE, and registers of these types cannot be confused for each other. It's just that these classifications/types aren't so useful, and we can't make our own (and perhaps all we really wanted was a distinction between integer and pointer)


Historically they didn't, until hardware floating point wired some registers up to special hardware. Which is why C lets you cast; BCPL just has bit patterns.


Yeah, sorry, I should've used the term dynamic typing.


I think you originally meant explicit type notation. I recently discovered that the actual type doesn't matter that much; the point is having an interface/protocol which enables compiler validation and tooling support. I learned this from Objective-C and Go. That's why I said the actual type system itself isn't important. An Objective-C protocol says nothing about types, but defines a nice interface for tooling support. A Go interface defines a set of promises, so the actual object structure doesn't matter.

Furthermore, recent languages offer automatic type inference - Haskell, Go, C++11. They force type information to be retained, but permit it to be elided where it can be accurately inferred/deduced.

This is completely different from not forcing type notation at all, as in Python, Ruby, Lua, and JS. In those languages, it is fundamentally impossible to track complete type information. But in explicitly type-notated languages, it's possible to track complete type information even where it's elided.

I think those type-notation-less languages are making some effort to offer type information by adding annotations, but I don't think that's really meaningful, because it isn't enforced and the community doesn't care much.
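Python 3 is a concrete case of "annotations exist but aren't enforced": the interpreter records them as metadata and never checks them (the `add` function is a made-up example).

```python
# Python 3 function annotations: recorded as metadata, never enforced.
def add(a: int, b: int) -> int:
    return a + b

print(sorted(add.__annotations__))  # ['a', 'b', 'return']
print(add("foo", "bar"))            # foobar -- no error: nothing checks the hints
```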



What about javascript?


JavaScript is:

1. Dynamically and strongly typed, but the typing is limited to primitive types.
2. So it's effectively untyped for objects, which are what really need type information.
3. Since it lacks any class/interface concept, type annotation is fundamentally impossible, and so is type tracking.
4. So it lacks the ability to offer type information to a toolset.

You don't have automated tooling support for types in JavaScript, and that will degrade your productivity. So the big companies interested in the JS platform are all offering JS with type notation:

1. Google = Dart
2. MS = TypeScript
3. Mozilla = Emscripten (in a very unique way!)


Javascript is typically described as weakly typed; e.g., "2" + 2 is valid Javascript.


Are we really going to discuss this again?

> code needs more debugging, it is harder to maintain and overall productivity is lower

References?


I found one study[1] which concludes that unit tests are not enough to reveal all the errors which would be revealed when using a statically typed language.

1. https://docs.google.com/file/d/0B5C1aVVb3qRONVhiNDBiNUw0am8/...


Isn't that why the Python guys are trying to use more annotations? To patch it with some sort of semi-decent static analysis?

Check out Dropbox's presentation on their pains: https://www.dropbox.com/s/83ppa5iykqmr14z/Py2v3Hackers2013.p...


A small number of Python guys are trying to use more annotations for a sort of closer-to-statically-typed Python. It's certainly not universal, or even a majority of developers.


That's your opinion, dude.


If you think Python is a worse Java then there is something important you have missed.


If you think that programming languages are strictly worse or better than others on a linear scale, then there is something important you have missed.


Always use the correct tools for the Job. Pragmatism wins


thumbs up.


For doing useful things on Android? I don't think it can be reasonably disputed that Python is worse than Java.


"write a Python compiler for iOS or Android to make it compelling, which means adding type checking"

No

PyPy is a (JIT) compiler for Python, no type checking addition needed.


The point is adding a type system, not compiling it.


Python has a type system.


An enforced type system is probably what he meant, i.e. one in which the annotations would have a practical application.


I'm not just being purposefully obtuse here. When read in the context of his upstream comments, I don't think we can make any assumptions as to what he meant.


At least, what I meant was the ability to offer that type information to automated tools - auto-completion systems, for example.

How Python actually handles types doesn't matter. Python lacks that ability because it doesn't force type annotation. This is fundamentally different from type-inferring/deducing systems such as Haskell, Go, and C++11.


I think it's not "fundamentally different". The type info is there in the code; it's just much more implicit and requires much more work to extract and use. One thing which does just this is the Jedi project (for Python), and it's absolutely astonishing how much data you can get out of it!

Also, I think that dynamic languages were meant to run inside a dynamic environment. For example, in Pharo Smalltalk (probably all Smalltalks) every single piece of metadata is runtime data. Static analysis makes no sense, because in Pharo there is no "static" at all - everything happens inside a living environment, and (for example) as soon as you write a method it's turned into a CompiledMethod object which has all the data about the method you would ever need, easy to query. Good luck implementing better refactoring tools than those in Smalltalk for any other language.

Essentially the same approach is used in Emacs Lisp. For example, if you see a function you don't recognize, you can jump to its definition. The thing here is that Emacs doesn't know where the definition is because of static analysis - it just has this compiled chunk of code in memory which happens to have the name you're looking for. This chunk of code knows the location of its definition and many other pieces of metadata, all available at runtime. It of course doesn't work if the function isn't already loaded into Emacs.
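Python sits somewhere in between: function objects likewise carry their own metadata at runtime, queryable through the `inspect` module. A small sketch with a made-up `greet` function:

```python
import inspect

def greet(name):
    """Say hello."""
    return "hello, " + name

# At runtime the function object carries its own metadata,
# much like a Smalltalk CompiledMethod or an Emacs Lisp function:
print(greet.__name__)                  # greet
print(inspect.signature(greet))        # (name)
print(inspect.getdoc(greet))           # Say hello.
```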

Most statically typed languages retain almost no type data in runtime. Most dynamically typed languages have almost no type data on compile time. I see this as largely equivalent.

So I guess what I want to say is that there is no fundamental difference in what dynamic and static languages are, but there is (and should be) a very fundamental difference in how they are used. Choosing between the two is, I think, almost exclusively a matter of preference. A good programmer should feel comfortable with both, though.


Lua can easily be compiled and used on both Android and iOS.


Or Lisp, which compiles and has optional type declarations

https://wukix.com/mocl


Isn't that just a subset of Common Lisp? It doesn't support runtime compilation, right?


The irony is the use of JavaScript in solutions like Titanium to code ANYTHING but the UI (the business logic is in JavaScript, the UI is native). I like the approach though; it's better than PhoneGap and the like. The only drawback: the engine they use doesn't seem to be open source.


In Titanium, you code both the business logic and UI in Javascript. The Javascript UI APIs Titanium provides call native APIs under the hood.

Titanium is open source; the repository can be found on GitHub.


> at some point developers start asking why they aren't just using Java in the first place...

Hasn't held anyone back on non-mobile systems, right?


Those languages do not focus on portability.


The main focus of Lua has been portability.


into other languages, not systems.



